
November 04, 2014

Intrinsically safe ecom Tab-Ex: another rugged tablet based on Samsung hardware

This morning I saw in the news that at the IFA Berlin show, ecom instruments launched what the press release called the "world's first Zone 1/Div. 1 tablet computer". New tablets are launched all the time, but I quickly realized that this was relevant for two reasons:

First, the Zone 1/Div. 1 designation means it's a tablet for use in hazardous locations. Zone 1, in the IEC/ATEX classification system that handles intrinsic safety issues, means a device can safely be used in areas where flammable gases are likely to be present. In essence, that requires that there's no chance the device can generate sparks or anything else that could trigger an explosion. I'd need to look up what exactly Div. 1 refers to; two different entities handle these classifications, the North American National Electric Code (NEC) and the European ATEX directive.

Intrinsically safe devices, i.e. devices that are incapable of igniting gases, dust or vapors, are very important in certain deployments, and so this new ecom tablet will certainly attract attention.

The second reason why this new ecom Tab-Ex tablet is relevant is that it's another example of a stock consumer Samsung tablet inside a specially developed case. In May 2014 we took a detailed look at the N4 device from Two Technologies, which is a Samsung Galaxy Note II with a value-added rugged case that also includes a second battery and a physical keyboard (see our review here). But whereas the Galaxy Note II is a 5.5-inch "phablet," the ecom tablet is based on the Samsung Galaxy Tab Active with a much larger 8-inch screen. And the Tab Active offers pretty impressive ruggedness specs even without any third-party enhancements: it's IP67-sealed, it can handle four-foot drops, and its 400-nit backlight is bright enough for outdoor viewing. The Tab Active is a US$700 tablet and you can see its full specs here.

ecom isn't hiding the fact that their Tab-Ex is based on a Samsung tablet. Even the press release openly states that this type of solution "provides compatibility and a wide range of preloaded applications for a safer work environment, including unparalleled security functions like device encryption, MDM, VPN and secure connectivity (Samsung KNOX)." And, perhaps even more importantly, that "being able to offer the same internal tablet, Samsung GALAXY Tab Active, ecom provides a key benefit of consistency in product use - whether Rugged, Zone 2 / Div. 2, Div. 1 or Zone 1. And, the software applications you develop for the Samsung Galaxy Tab Active will work unchanged on the Tab-Ex01."

All of this makes the ecom Tab-Ex another interesting case in the ongoing discussion on where the rugged computing industry should be moving to best take advantage of the worldwide popularity, and impressive technology, of consumer smartphones and tablets. As is, there are vehemently different opinions in the industry. Some feel that it makes perfect sense to pack readily available consumer technology into a value-added case, whereas others feel that the guts of a rugged device have to be just as rugged, and that the rugged market is inherently incompatible with the 6-month product/technology cycle of the consumer market.

Below you can see the ecom Tab-Ex on the left, and the donor Samsung Tab Active on the right.

And here are the ecom Tab-Ex product page and the Tab-Ex brochure.

Posted by conradb212 at 03:20 PM | Comments (0)

September 29, 2014

GoPro 4 -- the GoPro phenomenon, how it came about, and why it matters

On September 29, 2014, GoPro announced the GoPro Hero 4 and also a new basic GoPro camera. This is relevant for anyone dealing with ruggedized computing gear and using technology in the field for a number of reasons. But first, what exactly is GoPro and why do their products matter?

GoPro makes tiny little action cameras that, somehow, captured the public's imagination and are now found seemingly everywhere. You could almost say they've become a cultural phenomenon. GoPros are mounted on skydiver and motorcycle helmets, on race cars, on skateboards, on boats, on planes and drones, and seemingly on everything else, sometimes several of them. Now you can even mount them on dogs.

Why did this happen? GoPro cameras aren't particularly attractive — just featureless little boxes — and for several years they weren't very well known either. Initial markets were pretty much limited to what founder Nick Woodman had made the little camera for: surfers. Lore has it that since every surfer wants to be noticed and "go pro," that's where the name GoPro came from. Looks didn't matter as long as the camera could be mounted on a surfer or a surf board, and as long as it could capture their exploits. Initial GoPros weren't particularly exceptional. Even by 2005, when increasingly capable digital cameras had long been available, the first GoPro was just a cheap Chinese-made plastic film camera. Digital video arrived in the GoPro in 2006, in a little camera that looked remarkably like what GoPros still look like today.

That, of course, was the era of the Flip camera with its USB port. Flip was wildly successful for a couple of years, but was then, for inexplicable reasons, bought by Cisco of all companies, which did not seem to have any interest in it and simply shut Flip down, apparently seeing no market for small inexpensive vidcams.

But two things were happening that made all the difference for fledgling GoPro. The first, of course, was the demise of Flip, which left lots of people wanting a small, handy action cam. The second was the convergence of immensely powerful new compression technology and inexpensive small storage media. Between those two, it was suddenly possible to record glorious high definition video without the need for a big, bulky tape drive.

It's probably fair to say that without a company by the name of Ambarella, the GoPro revolution might never have taken place. That's because Ambarella provides the chips and intellectual property to process and compress high definition video so that hours of it can be recorded on inexpensive little SD and microSD storage cards.

So with a market pining for affordable, yet high-performance action cameras and Nick Woodman having the foresight to pack it all into his mighty little video cubes, the GoPro phenomenon was born. And that despite not even having been the first: RuggedPCReview editors reviewed the Liquid Image VideoMask and the Contour HD 1080p camcorder before we, not being surfers, even knew of the early GoPros.

Once we did get a hold of a GoPro Hero we published a detailed review entitled "GoPro Hero — the GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do" where we analyzed both the GoPro hardware and its performance in great detail. We were quite impressed with the "terrific HD video" and "stunning value" of the GoPro, but took major issue with GoPro's underwater housing that GoPro claimed was good to depths of 180 feet. The housing probably was, but its curved dome lens kept the camera from focusing underwater. Baffled, we wrote in caps "YOU CANNOT USE THE CAMERA UNDERWATER BECAUSE IT WON'T FOCUS?!!?" Baffled was too mild a word, as we had used GoPros to record majestic giant mantas hundreds of miles off the coast of Mexico on a once-in-a-lifetime trip, and the GoPro video was all blurry.

We reported in even greater detail on the GoPro Hero2, which still couldn't focus underwater. Apparently, the GoPro people were great surfers, but definitely not divers. The GoPro documentation now said "Please note that due to the curved lens of the HD HERO2 (and the original HD HERO) waterproof housing you will notice a slight loss of sharpness with underwater images." To which we responded in our report: "That is not good enough. It is NOT a slight loss. It makes the camera unusable underwater."

Our reviews, using third-party underwater housings that DID allow sharp focus, attracted GoPro's attention, and we subsequently helped them test an underwater housing that worked, as well as underwater filters that addressed the early GoPros' inability to adjust white balance to underwater conditions. Below is a brief video of the fun I had with a GoPro Hero2 and a pride of young sea lions off Coronado Island in Mexico.

The GoPro Hero3 premiered in October 2012 with an even better and less bulky underwater housing. The new line, whose body was even more compact than that of the Hero 1 and 2 and used tiny microSD cards, included a basic "White" model that performed at the Hero 1 level, a "Silver" Edition that was like an updated Hero2, and the top-of-the-line "Black" Edition with a powerful Ambarella A7 chip that allowed recording 1080p video at 60 frames per second and, for the first time ever pretty much anywhere, 4k video, albeit only at 15 frames per second. The "Black" Edition could also shoot stills and video simultaneously, and it had selectable white balance settings. We tested these new cameras both above and under water and also shot many hours of underwater 3D video with GoPro's ingenious system of linking two cameras together.

About a year later came the Hero3+, a slightly smaller, fine-tuned version of the Hero3 that efficiently addressed a number of Hero3 omissions and shortcomings, such as better audio, close-ups, better low-light performance, etc.

And now, GoPro has announced the Hero4, which includes the top-of-the-line Hero4 Black (US$499) and the Hero4 Silver (US$399), as well as a new basic camera, just called Hero (US$129). The most important change in the new models is that the Hero4 Black can do 4k video at a full 30 frames per second (and also 2.7k/50fps and 1080p/120). This means the camera is now capable of shooting usable 4k video as well as high quality slow motion HD video. That could be a major boon to buyers of 4k HDTVs who find out that there really isn't any 4k content. The Hero4 Silver has roughly the same capabilities as the old Hero3 Black but includes, a first for GoPros, an integrated LCD — prior GoPros could accommodate an optional snap-on LCD, but never came with an integrated one. The new US$129 entry-level Hero can do 1080p/30fps and 720p/60fps and is built directly into a waterproof housing.

What does that all mean? Overall, with the Hero4, GoPro addresses the criticism that its Hero3/3+ could only do 4k video at a technology-demonstration level, as 15 frames per second is essentially useless. The new "Black" can do real 4k video, and the high-res slow motion modes are very useful. The Hero4 will likely not only help GoPro remain wildly successful, but also cement its reputation for being at the bleeding edge of technology. The new "Silver" camera takes roughly the place of the older 3/3+ Black, and with the new basic US$129 Hero, the company has an inexpensive, invulnerable and totally waterproof camera.

How does it all work? We don't know yet. We've had ample experience with numerous Hero 1, 2 and 3 models, but haven't been able to get any response from GoPro as of late. GoPro is now a public company whose stock premiered at US$24 a share on June 25, and had reached US$90 a share just three months later, on the morning of the Hero4 release.

What has made GoPro such a success? Like all phenomena, it's a number of things that all came together at the right time. GoPros aren't pretty, they are difficult to use, and their battery life is marginal, but they get the job done and then some. No one else thought of offering all the numerous (and inexpensive) mounting hardware that allows GoPros to go along for the ride virtually anywhere. No one else pushed recording of extreme sports as much as GoPro, all in their signature ultra-wide angle. No other camera fueled the imagination of more videographers, who used GoPros everywhere to show new angles and record what could not be recorded before. No one saw the potential of Ambarella's ground-breaking compression technology and inexpensive tiny storage cards as clearly as GoPro. And so here we are.

What does it mean for users of rugged computing gear? It means that stunningly powerful high-speed, high-definition recording capabilities are available inexpensively and in a form uniquely suited for field operations. In the cases that come standard with every GoPro, the little cameras are virtually indestructible and can be used and mounted anywhere. Their recording performance is far above that of any camera integrated into a rugged handheld or tablet, and also above that of virtually all smartphones (and without the fear of breaking an expensive smartphone).

I think it's fair to say that virtually any job out there that benefits from using a rugged notebook, tablet or handheld would also benefit from bringing along a GoPro, or a few of them.

See GoPro's Hero4 press release

Posted by conradb212 at 06:14 PM | Comments (0)

September 16, 2014

The unpredictable nature of screen sizes

It's a mad, mad, mad world as far as the screen size of mobile devices goes. Witness...

For smartphones, 4.7 inches or so now seems the least customers will accept, and 5.5 inches or larger is better. When Apple introduced its iPhone 6 (4.7 inch) and iPhone 6+ (5.5 inch), the demand was such that both Apple's and AT&T's websites couldn't keep up. The AT&T website, in particular, was so messed up from all the pre-orders of giant phones that almost a week after I tried to place my own order, I still have no clue whether the order went through or not.

Dial back a couple of decades to the dawn of handhelds. The first Apple Newtons and Tandy/Casio Zoomers and such all had displays in the 5-inch range, and all were considered much too big and heavy. Which, of course, they were. So the Palms and Pocket PCs that followed weighed less and had much smaller screens. The standard screen size for an early Microsoft CE-powered Pocket PC, for example, was 3.8 inches, and that was quickly replaced by the 3.5-inch format. Phones weren't smart at that time, and when they did start getting some smarts, screens got progressively smaller as the quest was for smaller and smaller phones.

This drive for the tiniest phones possible had an impact on industrial handhelds, with many switching to 3.5, 3.2 and even 2.8-inch screens, much too small for most serious work.

What happened when the iPhone and then Android phones came along was that phones stopped being just phones and became computers with very small screens. The screen size issue was first addressed with "apps," software specifically designed for tiny screens, and then, increasingly, with larger screens as more and more customers wanted to do "real" computer work on phones. The lure and tangible benefits of larger screens outweighed the inconvenience of having larger and larger phones, and so now we have phones with screens measuring almost six inches diagonally. Obviously, it's not a trend that can go on.

Interestingly, tablets and laptops followed different trajectories.

Laptops initially had very small screens, because the industry didn't know yet how to make larger LCDs. Once that technological hurdle was solved, screens became ever larger, with laptop screens growing to 17 inches. That meant size and bulk and weight and cost. Many attempts at smallish, lightweight "boutique" laptops failed due to cost, until netbooks arrived. Despite their tiny screens they sold in the tens of millions, primarily based on their very low cost. The low cost, unfortunately, also meant low performance, and so customers demanded more speed and larger screens. The industry complied, but once netbooks were large and powerful enough for real work, they cost and weighed more and customers abruptly stopped buying and abandoned the netbook market in favor of tablets or, more specifically, the iPad.

Interestingly, despite the decreasing cost of large screens, laptops all of a sudden became smaller again. Apple dropped its 17-inch models and is devoting all its energy to super-light 11 to 13 inch models. And it's rare to see a ruggedized laptop with a screen larger than 15 inches, with most being in the 12 to 14 inch range.

With tablets, customers don't seem to know what they want. Historically, the first attempts at tablets and pen computers back in the early 90s all had 8 to 10 inch screens, primarily because there weren't any larger displays available. When Microsoft reinvented the tablet with its Tablet PC initiative in 2001/2002, almost all available products were 10 inches, with the 12-inch Toshiba Portege 3500 being the sole deviation, making it look huge. None of the Tablet PC era tablets and convertibles were successful in the consumer market, though, and that lack of interest didn't change until the iPad arrived.

The iPad set the standard for a "full-size" tablet with its 9.7-inch display that seemed neither too large nor too small, though many didn't know what to make of Apple sticking with the traditional 4:3 aspect ratio at a time when every display was going "wide." The aspect ratio issue hasn't been resolved as of yet, and screen sizes remain an issue. Initially unable to take any marketshare from Apple, competitors tried another way by going smaller and cheaper, and suddenly everyone predicted 7-inch tablets were the sweet spot. And the 7-inch class was successful enough to get Apple to issue the iPad mini which, however, never sold nearly as well as the larger original.

With tablets getting smaller and smartphones larger, the industry came up with the awkward "phablet" moniker, with "phablets" being devices larger than phones but smaller than conventional tablets. But now that phone displays are 5.5 inches and larger, "phablets" have become "phones," and the survival of 7-inch tablets seems in doubt, unless people suddenly decide that a 7-inch "phone" is desirable, in which case the old 1992 EO440 would look prescient rather than absurd.

As is, no one knows what'll come next in terms of screen size. Even years after the first iPad, tablets with screens larger than 10 inches have not caught on. Phones as we know them simply cannot get much larger or else they won't fit into any pocket. And the size of traditional notebooks will always be linked to the size of a traditional keyboard, the operating system used, battery life, and the cost of it all.

It should be interesting to see how things develop. And that's not even getting into display resolutions, where some phones now exceed the pixel count of giant HD TVs.

Posted by conradb212 at 04:57 PM | Comments (0)

August 22, 2014

Why OneNote for Android with handwriting is important

A few days ago, the Office 365 and OneNote blogs at Microsoft announced either OneNote for Android or the addition of handwriting and inking support to OneNote for Android; it wasn't quite clear which (see here). While Microsoft OneNote isn't nearly as popular as Word and Excel, it's available as part of Microsoft Office, and supposedly over a billion people use Office. So there may be tens or even hundreds of millions who use OneNote.

What is OneNote? It's sort of a free form doodle pad that can accommodate all sorts of data, from screen clippings to text, to audio, annotations, revisions, doodles, and so on. It can then all be sorted and shared with others. If that sounds a lot like the older Microsoft Journal, there are plenty of similarities.

Journal goes way, way back to the mid-1990s, when a company named aha! introduced it as the aha! InkWriter. The idea behind InkWriter was that you could arrange and edit handwritten text just like you can manipulate and edit text in a word processor; hence InkWriter was billed as an "ink processor." Handwriting and handwriting recognition were big back then, the central concept around which the initial pen computers of the early 1990s were built. As a result, InkWriter could also convert handwriting into text for later editing and polishing. To see how InkWriter worked, see the full manual here and Pen Computing Magazine's 1995 review here.

Anyway, Microsoft thought enough of it to buy InkWriter and make it available in Windows CE and Pocket PCs for many years. Renamed Journal, it's still available on most PCs. In many ways, OneNote looks like an enterprise and business version of Journal, one that's more practical and more powerful. To this day, some users prefer Journal while others prefer OneNote.

The question may come up as to why Journal didn't become a major hit once tablets really caught on. Aha! designed InkWriter specifically for pen computers, i.e. tablets. There was, however, one major difference between the tablets of the 1990s and the modern media tablet: all 90s-era pen computers had an active digitizer pen well suited for handwriting and drawing. Though no longer centered around handwriting recognition, Microsoft's Tablet PC initiative of 2001/2002 continued the reliance on an active pen, enabling super-smooth inking and beautiful calligraphy.

That still wasn't enough to make the Tablet PC a success, and it took several more years until the iPad showed that effortless tapping and panning and pinching on a capacitive multi-touch screen was the way to go. The tablet floodgates opened, and the rest is history. Unfortunately, with all the success of modern-era touch screen tablets, one simply cannot annotate a document with fingers, one can't write with fingers, and there's a good reason why fingerpainting is for kindergartners and not for serious artists.

Samsung realized this when it equipped its rather successful Galaxy Note phablets with miniature Wacom-style pens. But since the Wacom active digitizer adds size and cost, it isn't common in modern tablets, and even Windows-based tablets usually resort to those broad-tipped and rather imprecise capacitive pens that are, for the most part, nearly useless.

But all that may be about to change. The latest advances in capacitive touch enable passive pens with tips as narrow as that of a pencil (see Advances in capacitive touch and passive capacitive pens). That will be a boon to Windows tablets that still have all those tiny check boxes, scrollers, and other interface elements designed decades ago for use with a mouse. And it will be a boon to apps like OneNote (and Journal!).

And with Android now dominating the tablet market as well, the availability of OneNote for Android with full handwriting/inking capabilities may mean that OneNote will find a much larger user base than it's ever had. That's because, all of a sudden, the combination of OneNote and advanced passive capacitive pens will add a whole new dimension to tablets, making them infinitely more suitable for content creation rather than just consumption.

Posted by conradb212 at 10:13 PM | Comments (0)

July 31, 2014

Android on the desktop!

Though Android dominates the smartphone market and has a very strong position in tablets, until now Google's OS platform was not available for desktops and conventional notebooks (the Chromebook with its limited offline functionality doesn't really count).

That has now changed with the new HP SlateBook, a full-function, quad-core Tegra 4-powered notebook with a 14-inch 1920 x 1080 pixel 10-point multi-touch screen, and running Android 4.3. The Slatebook weighs 3.7 pounds, offers up to 9 hours of battery life from its 32 Watt-Hour battery, has USB 3.0 and HDMI ports, 16GB of eMMC storage, a full-size keyboard, and starts at US$429.

The HP SlateBook is not a rugged product, but its arrival will almost certainly have an impact on the rugged computing industry. In fact, I believe that Android availability on desktops and laptops may change everything.

Think about it: While Microsoft continues to dominate the desktop and laptop, Redmond's presence in tablets is mostly limited to industrial and enterprise products. And while Windows CE and Windows Mobile/Embedded Handheld have managed to hang on in industrial handhelds for years longer than anyone expected, the handwriting is on the wall there, too. Between not having any upgrade or migration path between WinCE/WinMobile and Windows Phone, and Windows Phone itself just a distant also-ran in smartphones, we'd be very surprised to see any sort of dominant Microsoft presence on industrial handhelds in the future.

One of the primary reasons why WinCE/WinMobile managed to hang around for so long was the leverage argument: enterprise runs Microsoft and enterprise IT knows Microsoft, so using Microsoft on handhelds means easier integration and less development and training costs. But with all of those hundreds of millions of iOS and Android tablets and smartphones, even Microsoft-based enterprise quickly learned to work with them and integrate them, so the leverage argument isn't what it used to be.

On the desktop side, it will forever be unclear to me what drove Microsoft to craft Windows 8 as a dual-personality system with the largely useless (for now anyway) Metro interface and a crippled legacy desktop that made no one happy. The 8.1 upgrade fixed a few of the worst ideas, but even as a face-saving retreat it was too little to address the basic structural problems. And whatever surfaces as Windows 9 will still have the unenviable task of seeking to perpetuate Microsoft's dominance of the desktop, where most people still work, while watching the world go ever more mobile on non-Microsoft platforms.

I think that by far Microsoft's strongest argument is that a mouse-operated windowed multi-tasking environment on a large display remains superior for creative and office work to finger-tapping on a tablet. I think the Office argument isn't nearly as strong. Sure, Office has a virtual monopoly in the classic office apps, but perhaps 90% of its functionality is available from numerous other sources. Even as a journalist, writer and publisher I rarely use more than the barest minimum of Office's myriad of functions.

And though at RuggedPCReview.com we're OS-agnostic, by far our platform of choice is the Mac. The painless OS upgrades alone would be enough to pick Apple, but what really clinches the deal is that when you get a brand-new Mac, you simply hook up the Time Machine backup from your old one and transfer everything to the new one, apps, data, settings and all. The next morning, whatever was on the old Mac is on the new one. No fuss at all. That is simply not possible with Microsoft.

Anyway, I often thought how nice it'd be to load Android on some of the numerous old Windows machines in the office that we can no longer use because the Microsoft OS on them is no longer supported or -- worse -- the system decided it was now pirated because we put in a new disk or processor. I thought how nice it'd be to have Android on the desktop or a good notebook for those times when you simply do need the pin-point precision of a mouse, the solidity of a real desktop keyboard, the comfort of a real desk and office chair, or the ability to see everything on a big 27-inch screen.

Or how nice it'd be to have exactly the same OS on such a big, comfortable, productive machine as well as on a handy tablet or a smartphone. I'd use that in a heartbeat.

There are, of course, questions. Microsoft itself has wrestled for decades with ways to provide a unified Windows experience on various platforms, and almost all those attempts failed. Apple didn't fall into the one-OS-for-all-devices trap and instead chose to optimize the user interface to each platform's physical characteristics. But that came at the cost of a rather weak connection between the Mac OS and iOS (why Apple doesn't get more criticism for iTunes and iCloud has always been beyond me). And even if Android were to emerge, full-blown, on laptops and desktops, we'd soon miss having multiple windows open. I mean, flipping between full-screen apps might have been acceptable back in the late 1980s, but going back to that would be a giant leap into the past.

Still, if full Android were available for desktops today, I'd drop everything I was doing to install it on several of the systems in our office right this instant. And I am pretty certain that full Android availability on desktops and laptops would mean a massive renaissance for those currently beleaguered platforms.

So the release of HP's Android SlateBook just may be a milestone event.

Posted by conradb212 at 08:00 PM | Comments (0)

July 25, 2014

Advances in capacitive touch and passive capacitive pens

RuggedPCReview.com just finished reviewing the Panasonic Toughpad FZ-M1, and we ended up calling it a "milestone product" because of its novel and unique passive capacitive stylus, which can greatly increase productivity by allowing far greater precision on the Windows desktop than touch alone or earlier capacitive pens.

This article describes the past and current situation of touch in rugged tablet computers, and why the technology employed by Panasonic in the FZ-M1 is so relevant.

Until the iPhone and then the iPad popularized capacitive touch, tablets used either active or passive digitizers. Active digitizers used a special pen, with Wacom's electromagnetic system by far the most popular because its slender pen did not need a battery. Passive digitizers were mostly of the resistive variety, and worked with any stylus, piece of plastic or even a finger nail. Neither of these older digitizer technologies helped tablets make much of an impact beyond limited industrial and special market applications.

Active pens allowed for fairly precise operation and smooth inking, and since the cursor followed the tip of the pen even without the pen touching the display, users always knew exactly where the cursor was, just like with a mouse. But tablets relying on active pens became useless if the pen was misplaced or lost, necessitating the addition of a touch digitizer, which added cost and the need for "palm rejection," i.e. keeping the system from mistaking the pressure of one's palm, when writing with the pen, for input.

Passive resistive digitizers are still widely used, but they are a poor match for any user interface designed for operation with a mouse. That's because with them, the touch of a finger or the tip of a stylus is the equivalent of a left mouse click whether or not the point of contact was where the user intended. Resistive touch works well for specially designed applications with large buttons, but relying on it for precise operation, like using a word processor or spreadsheet, can be quite frustrating.

Apple didn't invent capacitive touch, but the iPhone certainly made a compelling case for effortlessly navigating an operating platform specially designed for the technology, and the iPad really drove the point home a few years later. Tapping, panning, pinching and zooming quickly becomes second nature. To the point where NOT being able to pan and zoom on a display has become almost a deal-breaker.

While capacitive multi-touch certainly redefined the way tablets are used, and is largely responsible for the massive success of the tablet form factor, the technology isn't perfect, or at least not yet. It works very well in operating environments designed for it, such as iOS and Android, and with any app where tapping, panning, pinching and zooming gets the job done. But capacitive touch works far less well with software not originally designed for it, software that relies on the precision, small-fractions-of-an-inch movements of the mouse interface. That's why it remains clumsy to use word processors or spreadsheets or graphics apps with capacitive touch -- the precision required to operate them just isn't a good match for something as blunt as the tip of a finger. It's conceivable that a generation growing up with touch, kids who may have never used a mouse, won't see this as a shortcoming, but simply as the way touch works, and there'll undoubtedly be software that'll get the job done.

As is, we have a situation where hundreds of millions still rely on software that was NOT designed for touch (Windows), and so the need to continue to use that software conflicts with the desire to use state-of-the-art tablet hardware with capacitive multi-touch. So why not use a combination of capacitive multi-touch and an active pen, combining the best of both worlds? That can be done, and is being done. Samsung has sold tens of millions of Galaxy Note "phablets" that combine capacitive multi-touch with a Wacom active digitizer pen (making it by far the most successful pen computer ever sold). Is it a perfect solution? Not totally. The Wacom digitizer adds cost, what you see on the screen tends to lag behind when the pen moves quickly, and things get imprecise around the perimeter. And if the pen gets lost, the pen functionality of the device goes with it.

Whatever the technical issues may be, we've now reached a point where customers pretty much expect capacitive multi-touch even in industrial and vertical market tablets. The tap / pan / pinch / zoom functionality of consumer tablets has just become too pervasive to ignore. So we've been seeing more and more rugged and semi-rugged tablets (as well as handhelds) using capacitive touch. That's no problem with Android-based tablets, since Android was designed for capacitive touch. But unlike in the consumer market, where iOS and Android dominate, enterprise customers continue to demand Windows on their tablets. Which means a precise pen or stylus is pretty much mandatory.

Now what about capacitive pens? They have been around since the early days of the iPad, using a broad rubber tip on a stylus to provide operation a bit more precise than is possible with a finger. How much more precise? That depends. Even slender index finger tips measure more than 10 mm, whereas those capacitive styli have tips in the 7-8 mm range. That seems improvement enough for several manufacturers of rugged tablets to include capacitive styli with their products. The tips of those styli are narrower than those of earlier versions, but still in the 5 mm range, and they still have soft, yielding tips. They work a bit better than older ones, but in no way as well as a mouse or an active pen. Not much can be done about that, or can it?

It can.

When Panasonic sent us a Toughpad FZ-M1 rugged 7-inch Windows tablet for review, it came with a full-size plastic pen. I assumed it was a passive stylus and that the little Panasonic tablet had a resistive touch interface in addition to its capacitive multi-touch. But Panasonic clearly described the stylus as "capacitive." In use, the pen with its hard 1.5mm tip was quite precise, far more so than any capacitive stylus I've ever used. It required only a very light touch to operate the classic Windows user interface. Panning was easily possible, and unlike resistive styli, which really can't be used for fine artwork or handwriting (they are too jerky), this stylus produced smooth lines. It also never fell behind, one of my pet peeves with Wacom pens. The picture above shows the Panasonic pen (left) sitting next to a couple of older capacitive styli.

So what was it? I contacted Geoff Walker, RuggedPCReview's former technology editor and now Senior Touch Technologist at Intel, and asked his opinion. Geoff got back to me with a detailed email and said that the Toughpad FZ-M1 apparently was an example of the very latest in projected-capacitive touch controller capability. He explained that over the past couple of years the signal-to-noise ratio of touch controllers has been increased to a point where they can now recognize conductive objects no more than a millimeter thick. Whereas the controllers used in the original iPad may have had signal-to-noise ratios of 10:1 or 20:1, the latest controllers can handle 3000:1. He suggested we try other objects on the Toughpad, and I found that a standard plastic stylus doesn't work, but a regular pencil does, as does a metallic stylus or even a metal paper clip. There's no pressure-sensitivity, which is something Wacom pens can do, but that's not really necessary for anything but art.
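As a back-of-the-envelope sanity check on Geoff's numbers (my own arithmetic, not anything Geoff, Intel or Panasonic published): if the signal a projected-capacitive controller sees scales roughly with contact area, a 1.5mm tip delivers only a small fraction of a 10mm fingertip's signal, which is exactly why that huge jump in signal-to-noise ratio matters.

    # Back-of-the-envelope arithmetic (my own estimate, not a published spec):
    # assume the sensed signal scales roughly with contact area.
    finger_mm, tip_mm = 10.0, 1.5
    area_ratio = (tip_mm / finger_mm) ** 2
    print(f"pen tip signal vs. fingertip: {area_ratio:.1%}")   # roughly 2%

    # A signal ~45x weaker than a finger's swamps a 10:1 or 20:1 controller,
    # but leaves ample margin at 3000:1.
    for snr in (10, 20, 3000):
        print(f"{snr}:1 finger SNR -> roughly {snr * area_ratio:.1f}:1 at the pen tip")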

But there's more. The Toughpad FZ-M1 also comes with a touch screen mode setting utility.

Default mode is Pen / Touch, where both fingers and the pen are recognized.

There's pen-only operation, like when drawing or writing, or whenever you don't want your fingers to inadvertently trigger an action. Geoff said that probably works by measuring the area of touch and ignoring signals above a certain size.

Next is a glove touch setting. Geoff said this could be done by increasing the sensitivity of the touch controller so that it can recognize a finger even a brief distance away from the screen, i.e. the distance that the material of a glove adds between the finger and the screen. That appears to be the case, as the Toughpad FZ-M1 does work while wearing thin gloves. I thought it was the conductivity of the glove material that mattered, but apparently it's the thickness of the glove.

Then there is a Touch (Water) setting. Capacitive touch and water generally don't get along because, as Geoff explained, water is so conductive that it affects the capacitance between two electrodes, the concept upon which projected capacitive touch is built. What can be done, Geoff said, is switch from the standard mutual capacitance mode to self-capacitance where the capacitance between one electrode and the ground is measured. The capacitive pen doesn't work in this mode because a fairly large touch area is required. I tried this mode with water from a faucet running onto the display. The display still recognized touch and to a very limited extent two-finger operation, but this mode is mostly limited to tapping.
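Putting Geoff's explanations of the four modes together, here's a minimal sketch of how mode-dependent contact filtering might look. This is purely my own illustrative Python, not Panasonic's actual controller logic, and the thresholds are invented for the example.

    # Purely illustrative sketch of mode-dependent touch filtering, based on
    # Geoff Walker's explanations above; the thresholds are invented and this
    # is not Panasonic's actual firmware logic.
    def accept_contact(mode, size_mm, snr):
        if mode == "pen_touch":        # default: fingers and the pen both pass
            return snr > 10
        if mode == "pen_only":         # ignore finger-sized (or larger) contacts
            return snr > 10 and size_mm < 3.0
        if mode == "glove":            # weaker signal through the glove, so a
            return snr > 3             # more sensitive (lower) threshold
        if mode == "water":            # self-capacitance fallback: only large,
            return size_mm > 8.0       # unambiguous taps are trusted
        return False

    print(accept_contact("pen_only", 1.5, 300))   # pen tip passes -> True
    print(accept_contact("pen_only", 12.0, 40))   # resting palm ignored -> False

The real controller firmware is doubtless far more sophisticated, but the principle, classifying each contact by size and signal strength and then filtering by mode, appears to be the common thread.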

What does all this mean? For one thing, it means that thick-tipped capacitive styli will soon be a thing of the past, replaced by far more precise passive capacitive pens with tips in the one-millimeter range, like the one available with the Panasonic Toughpad FZ-M1. Second, and I don't know whether this is technically possible in Windows, since the touch controller can sense the tip even when it's in close proximity to the screen, "hovering" or "cursor tracking," where a cursor follows the pen even without touching the screen (and thus without triggering a left mouse click), may also be implemented. That's one useful thing active styli do that even advanced passive capacitive pens can't do (yet).

Would all of this still matter if it were not for the need to better support the legacy Windows desktop and its mouse-centric applications? It would. Tablets, hugely popular though they are, are mostly used for content consumption. The emergence of precise pens will open tablets to a much broader range of content creation. -- Conrad H. Blickenstorfer, July 2014

Posted by conradb212 at 04:14 PM | Comments (0)

June 30, 2014

Reporting from the road -- sort of

As editors of RuggedPCReview.com, we're probably better equipped than most to report anytime, anywhere, and under any conditions. After all, we not only have access to the latest and greatest mobile computing and communications gear, but much of that gear is designed to go just about anywhere.

The reality is a bit different, as we learned the hard way on a stretch of several weeks on the road and high sea. Testing underwater still and video camera equipment around some of the West Indies islands in the Caribbean, we wished for two things. One was that cell service were truly global, without all the hassles of having to switch SIM cards, sign up for international service, and fear the ever-present phone company gotchas that can result in huge charges on the next monthly bill. Another was that the GPS apps on our various devices were not so completely dependent on cell coverage to work. They all have many gigs of storage, so why not include a reliable and reasonably precise base map that always works, no matter where you are on the globe?

So while on the good ship Caribbean Explorer II, we were essentially cut off from the world. There are WiFi hotspots in ports, of course, but most are locked and the signal of those that are not was usually insufficient for anything other than frustration and perhaps a Facebook update or two.

A subsequent extended road trip to the great state of Tennessee promised better coverage. Between a MacBook Pro, two iPads, and two iPhones, keeping in touch and getting work done seemed doable. To some extent anyway. It's amazing what all can be done workwise on a tablet these days but, truth be told, it can take a lot longer to get even simple things done. That's where the MacBook was supposed to come in. The big 15-inch Apple laptop doesn't have WWAN, though, and so we signed up for AT&T's 5GB US$50 hotspot service on one of the iPads.

Why pay for the iPad hotspot when there's WiFi coverage virtually everywhere these days? Because that WiFi coverage often isn't there when you need it most, or it doesn't work right, or sitting in a busy Starbucks just isn't the best place to get work done. The hotspot worked fine until, after just three days on the road, a message came up on the iPad saying we had used up the allotted 5GB and it was time to pay for more. What? Apparently three days of moderate browsing and some work was enough to go through a full 5GB. I have no idea how that bit of activity could use so much data, but it did. That bodes ill for a future where increasingly everything is metered. Data usage is never going to go down, and metered data looks to be an incredible goldmine for the telcos and a massive pain for users.
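For perspective, some rough arithmetic (my own guesswork at the culprit, not anything AT&T disclosed): 5GB in three days is about 1.7GB per day, and even a single modest video stream can burn through that surprisingly fast.

    # Rough arithmetic on the 5GB-in-three-days mystery (my own estimate).
    allotment_gb, days = 5.0, 3
    print(f"{allotment_gb / days:.1f} GB per day")              # ~1.7 GB/day

    # A single 2 Mbps stream, typical of web video at the time:
    mbps = 2.0
    gb_per_hour = mbps / 8 * 3600 / 1024                        # ~0.88 GB/hour
    print(f"{gb_per_hour:.2f} GB per hour of streaming")
    print(f"{(allotment_gb / days) / gb_per_hour:.1f} hours/day would do it")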

In the end, what it all boiled down to was that, yes, we're living in a connected world, but no, you're not really connected all the time and you can't ever rely on having service. And, surprisingly, digital data coverage remains a frustratingly analog experience. There's coverage, maybe, somehow, but it's marginal and you may or may not get a connection, and if you do it may not be good or persistent enough to get actual work done. From that standpoint, it's a mobile world, but one that requires you to be stationary to actually make it work.

I often tell people that I can work anywhere, anytime, as long as I have internet access. But that's only true to some extent. Tablets and smartphones can only do so much. Even a full-function laptop is not likely to include every utility, file and tool that's on the desktop in the office. "The Cloud" truly cannot be relied on. And neither can data service.

All of that will change someday, and hopefully someday soon. As is, the connected world is a work in progress.

Posted by conradb212 at 03:49 PM | Comments (0)

April 23, 2014

Unpacking the Xplore iX104 XC6

So we get this box from Xplore Technologies, and it's pretty heavy. And it's a bit grimy. We figured we'd better open it outside. This is what happened:

Yes, Xplore sent us the brand-spanking-new iX104 XC6 to make a point. Sod: it can handle grime and dirt. Sunglasses: you can use it in bright sunshine. Measuring tape: you can drop it from seven feet. Ice cube tray: it's freeze-proof. Inflatable pool ring: it can handle full immersion.

It also has a Haswell processor under the hood. And dual 128GB solid state disks in a RAID 0 arrangement. So equipped and still wet from the hose-down, the big, tough Xplore blasted to the fastest PassMark benchmark we ever recorded. Impressive.

Posted by conradb212 at 12:48 AM | Comments (0)

April 07, 2014

Durabook R8300 -- ghosts of GoBooks past

There are things in life where the outrage and pain just never seems to go away. For me that includes the infamous game 6 in the 2002 basketball playoffs where the NBA stole the championship from my Sacramento Kings, the forced demise of the Newton, a relationship issue or two, and then there is the way General Dynamics took over that intrepid small town Itronix computer company up in Spokane, Washington, just to then ruin it and shut it down.

There. Hundreds of people lost their jobs in Spokane when the corporate bigwigs at General Dynamics moved operations into some unfilled space in one of their buildings in Florida and then shuttered the Spokane facility where some of the most dedicated people in the industry had built rugged computers since the early 1990s.

But wait... not absolutely everything is lost. You see, like most US computer companies, Itronix had a close working relationship with a Taiwanese OEM/ODM, in this case Twinhead. While the precise details of the Itronix/Twinhead relationship are known only to insiders, it's safe to assume that there was close interaction between the Itronix designers and engineers up in Spokane and their Twinhead counterparts in Taipei. This was not a case of a US company just slapping its label on a machine designed and made in Taiwan. It was a close cooperation, and most of the machines sold by Itronix were exclusives, meaning that no one else sold them under their own brand and label.

An example was the Itronix flagship GoBook III, which was replaced by the GoBook XRW, and then, once General Dynamics had inexplicably discarded the hard-earned brand equity in the "GoBook" name after they took over, the GD8000 (shown in picture) and its tech refresh, the GD8200. That machine and Getac's fully rugged notebooks were the Panasonic Toughbooks' primary competition in the rugged notebook market. Precise sales figures are hard to come by in this industry, but by most accounts Itronix had about a 12% marketshare.

It's been over a year since Itronix was shuttered, but what should suddenly arrive but the GammaTech Durabook R8300. It immediately seemed familiar to me, and a closer look revealed that, yes, it's an updated version of the GD8200. The name alone gives a clue as GammaTech usually names its devices with a combination of letters and numbers centering around display size, like CA10 or S15H. Itronix named their machines by series, and so it was the GD4000, GD6000, and GD8000. The GD8200 refresh may have signified its move to 2nd generation Intel Core processors, in which case the R8300 name could be a combination of R for rugged, and 8300 paying both homage to the machine's origin and the switch to 3rd generation Ivy Bridge processors.

Be that as it may, its origin and history instantly qualify the Durabook R8300 as a serious contender in the rugged notebook market. Yes, a 4th-gen Intel chip would have been nice, but keeping up with Intel's ever-changing generations isn't the highest priority in a class of machines where longevity and backward compatibility mean more than the very latest specs. As is, the R8300, having the same design and layout as the GD8000 and GD8200, will most likely work with all existing GD-Itronix docks and peripherals, and anyone seeking to replace aging 8000 Series Itronix notebooks should be thrilled.

So at least some part of the longstanding, fruitful cooperation between Twinhead and Itronix lives on. The GD8200 was a terrific workhorse of a machine, and with the updated tech specs, the Durabook R8300 is certain to be even better.

Posted by conradb212 at 06:41 PM | Comments (0)

March 19, 2014

Getac's latest rugged convertible replaces the V8 with a turbo-4

I love cars and often use automotive analogies to describe situations. One came to mind recently as we evaluated a particularly interesting new rugged mobile computer. So here goes:

With gas mileage becoming ever more important in cars and trucks, the automotive industry has been pulling out all the stops to come up with more fuel-efficient vehicles. One way to boost fuel efficiency is to reduce the weight of the vehicle. Another is to use smaller turbocharged motors that burn less fuel while providing the same performance.

That came to mind when we did a full hands-on test of the new Getac V110 convertible notebook computer. It's very significantly lighter than its two predecessor models. And it uses a processor with less than half the thermal design power of those predecessor models. Yet it offers close to the same performance and has even longer battery life.

Here's how this likely came about. While Getac's early convertible notebooks had used economical low-voltage processors, subsequent models became more powerful and required larger batteries. That eventually pushed their weight into the six to seven pound range, quite a bit for devices meant to be carried around and occasionally used as tablets.

So Getac's engineers went back to the drawing board and designed a new convertible notebook, using the latest space- and weight-saving technologies to cut an ounce here and a fraction of an inch there. Most importantly, they switched to low-voltage versions of Intel's 4th generation Core processors that include new and superior power saving features. That allowed them to replace the massive battery of the older models with two small batteries, each no larger than an iPhone. The resulting design, the V110, weighs over two pounds less than the last V200 we tested, and a good pound and a half less than the old V100. The V110 is also much thinner.

So as for the automotive analogy, Getac replaced a hulking V8-powered SUV with a much svelter, lighter one with a turbo-4.

But does that work in the real world? For the most part it does. While peak processor performance of the V110 is close to that of the standard-voltage predecessor models, idle power draw is less than half. What that means is that in many routine applications, the re-engineered and much lighter V110 will get the job done on what amounts to half a tank compared to the predecessor models. There's just one area where the automotive analogy breaks down: whereas automotive turbos can suck a good amount of fuel under full load, the V110 remained quite economical even when pushed to the limits.

Posted by conradb212 at 03:55 PM | Comments (0)

February 25, 2014

Samsung Galaxy S5 -- raising the bar

On February 24, 2014, Samsung showed their new flagship smartphone, the Galaxy S5. It's relevant because it'll sell by the many millions, and it'll be the primary competitor to Apple's current iPhone 5s and whatever Apple comes up with next. But the Galaxy S5 is also relevant because the technology it includes and provides will become the new norm. It'll be what people expect in a handheld. And that affects the rugged industrial and vertical markets, because what the Galaxy S5 (and soon many other consumer smartphones) offers is the benchmark, and people won't accept considerably less on the job.

The Galaxy S5 is essentially a little tablet with phone functionality built in. It measures 5.6 x 2.9 x 0.32 inches and weighs five ounces. That's a lot larger than those tiny phones of a few years ago when the measure of phone sophistication and progress was how small a phone could be made. The S5 is really not pocketable anymore, but because it's so thin it still weighs less than half of what some of the original PDAs weighed.

The Galaxy S5 screen measures 5.1 inches diagonally, and offers full 1920 x 1080 pixel resolution. That's the same as the current generation of giant flatscreen TVs, making the Galaxy's display razor-sharp, and essentially making a mockery of any claim to "high definition" that a current TV may have. And the screen uses OLED technology, which means no backlight and a perfect viewing angle.

There are, of course, two cameras. The front one has 2mp resolution, which in theory means that vidcam conversations can go on in full 1080p video. The other has 16mp, not quite as absurdly extreme as the 41mp Nokia offers in its flagship, but much higher than Apple's, and the same you'd expect from a good dedicated camera. And it can even record 4k video. If that's at a full 30 frames per second, it's something even the vaunted GoPro Black Edition can't do, and it'll mean that the vast majority of hyper-sharp 4k video may come from smartphones, and not from some sort of disc player (no 4k standard exists yet), not from streaming video (most internet connections still can't handle that), and not from dedicated cameras (there are hardly any for now).
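Some quick arithmetic behind those camera and display claims (my own numbers, not Samsung's): a 1080p frame is just over two megapixels, which is why a 2mp front camera is exactly enough for full-HD video calls, and the 5.1-inch full-HD screen works out to a pixel density around 430 pixels per inch.

    # Quick arithmetic behind the Galaxy S5 display and camera claims.
    import math

    w, h, diag_in = 1920, 1080, 5.1
    print(f"1080p frame: {w * h / 1e6:.2f} megapixels")              # ~2.07 MP
    print(f"screen density: {math.hypot(w, h) / diag_in:.0f} ppi")   # ~432 ppi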

Memory, no surprises there. But I should mention once again that the 16 or 32GB of storage built into smartphones such as the new Galaxy S5 more than likely contributed to the impending demise of lower-end dedicated digital cameras which, for some unfathomable reason, still only have 32MB (megabyte, not gigabyte) on board, one thousand times less than a smartphone.

Communication? The fastest available WiFi (802.11ac), the latest rev of Bluetooth (4.0), and also NFC, near field communication, sort of a version of RFID. Then there's a fingerprint scanner and, also new, a pulse scanner that can feed heart rate into all sorts of apps and systems. I've long believed that sensors are a field of rapid advance and new capabilities, providing handhelds with ever more information about their environment, and thus making them ever more useful.

And it's all powered by a 2.5GHz quad-core Snapdragon 800 processor that makes older processor technology look like cast iron flathead six designs from the 40s compared to a computer-controlled modern turbo with variable valve timing.

Setting the bar even higher, this new Galaxy phone carries IP67 sealing, which means it's totally dust-proof, and also waterproof to the extent where it can survive full immersion into water down to about three feet. IP67 has long been the holy grail of rugged computer sealing, and now Samsung offers it in a consumer smartphone that's an eighth of an inch thick.

I think it's fair to say that the bar has been raised. Dedicated industrial and professional tools will always do this thing or that better, have better dedicated components such as scanners, and they can, overall, be much tougher than any eighth-inch smartphone can ever be. But this level of technology, so freely available to millions, is simply certain to have an impact on expectations.

Posted by conradb212 at 03:48 PM | Comments (0)

February 18, 2014

Android parlor trick

Just a brief entry here....

Up until Android "Jelly Bean," i.e. versions 4.1.x through 4.3.x, one of the cool things about Android was the (relative) ease with which one could do screen grabs. Those, of course, are essential to product reviewers. And so it was good to know that all one had to do was connect the Android device to a PC or Mac, fire up the Android SDK, enable Applications > Development > USB debugging, and grab those screens.

That's what I wanted to do when I recently upgraded an Android tablet from "Ice Cream Sandwich" to "Jelly Bean." Unfortunately, Applications > Development > USB debugging was no longer there, and there seemed nothing else that would allow access to the debugging mode. Google to the rescue.

Well, it turns out that getting into Android debugging mode now involves a secret handshake. You go to About Tablet, then tap on Build number SEVEN TIMES. That enables the Developer options menu, where you need to turn on USB Debugging. That's about as non-obvious as it gets, and probably reflects Google's efforts to keep all those hundreds of millions of Android users from hurting themselves by accidentally disabling their device.

That probably makes sense. I still believe one of the reasons why Linux never really made it big as an OS for the masses is that its creators insisted on leaving the arcane technical underbelly more or less visible to all. As Android matures, Google can't allow that to happen. Just like "View Page Source" has now vanished from easy view in all major browsers.

But it's a good party trick. Next time you see a techie at a cocktail party or elsewhere, test him or her by innocently asking "How do I get into debug mode under Jelly Bean??"
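As for the screen grabs themselves: once USB debugging is back on, they don't even require the full SDK workflow anymore, since Jelly Bean devices include a built-in screencap command that can be driven over adb. A minimal helper script of my own, assuming the Android SDK's adb tool is installed and on the PATH:

    # Minimal screen-grab helper for a Jelly Bean device with USB debugging
    # enabled; assumes the Android SDK's adb tool is on the PATH.
    import subprocess

    def grab_screen(out_file="screen.png"):
        # screencap -p writes a PNG on the device; pull copies it to the host
        subprocess.check_call(["adb", "shell", "screencap", "-p",
                               "/sdcard/screen.png"])
        subprocess.check_call(["adb", "pull", "/sdcard/screen.png", out_file])

    grab_screen()

If memory serves, screencap has shipped with Android since around Ice Cream Sandwich, so it covers the devices discussed here.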

Posted by conradb212 at 05:54 PM | Comments (0)

January 12, 2014

More 4k video contemplations

All of a sudden, everyone is talking about 4k video. Also known as Ultra-HD, with four times the resolution of the 1920 x 1080 pixel 1080p standard, 4k was everywhere at the Consumer Electronics Show in Las Vegas. Now, obviously, 4k video isn't the most important thing on rugged mobile computer manufacturers' minds, but it is nonetheless a sign of changing times. And with some consumer smartphones already offering full 1080p resolution on their small screens, and consumer tablets going well beyond that, it's only a matter of time until rugged and industrial market customers, too, demand much higher resolution in their professional computing gear. So keeping track of what's happening out there with ultra-high resolution makes sense.
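For the record, the arithmetic behind "four times the resolution": Ultra-HD doubles 1080p in each direction, which quadruples the pixel count.

    # Ultra-HD ("4k") vs. 1080p pixel counts.
    hd = 1920 * 1080       # 2,073,600 pixels
    uhd = 3840 * 2160      # 8,294,400 pixels
    print(uhd / hd)        # 4.0 -- twice the pixels in each direction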

As I stated in an earlier essay on high resolution (Thoughts about display resolutions, December 2013), I recently purchased a 39-inch Seiki 4k flatscreen display that can be used as a monitor or as a TV. It was an impulse buy, and I justified the (remarkably reasonable) price by deciding the Seiki was a research investment that would help us here at RuggedPCReview.com learn more about how 4k video worked, what was possible, and what wasn't.

On the surface, 4k video makes an awful lot of sense. 1080p HD video was just perfect six or seven years ago for the emerging flood of 40-50 inch flatscreens. But flatscreens have grown since then, and 65, 70 and even 80 inches are now the norm. As you can imagine, the same old 1080p video doesn't look nearly as sharp on screens with two, three, or four times the real estate, or more. So doubling the resolution in both directions makes perfect sense.

And it's a great opportunity to infuse new life and excitement into the flatscreen TV market. Three years ago, everyone offered 3D TVs. An amazing number were sold, amazing given that there was virtually no 3D content. And amazing considering one had to wear those annoying 3D glasses. So 3D quietly became just a built-in feature in most new TVs, but it's no longer a selling point. 4k video, on the other hand, IS now a selling point. And it'll become even bigger as time moves on.

The problem, though, is the same as it was with HD, initially, and then with 3D: no content. There is no native 4k video standard for storage or players. There are no 4k players and only a very few recorders, none of which are mature at this point.

So we did some testing to see what's possible, and what's not. The goal was to find out whether it's actually possible to get 4k video without breaking the bank. So here's what we did, and how it worked out so far.

What can you do today with a 4k display such as our 39-inch Seiki? Well, you can watch regular HD TV on it via your satellite or cable setup. You can hook it up to video game consoles. You can connect it to streaming video gizmos like Apple TV, Google Chromecast, or the various Roku-type devices. Or you can connect a tablet, notebook or desktop to it. Sadly, almost none of these support resolutions higher than 1080p. Which means that on a 4k display you get video that may or may not look better than on a regular 1080p display. May or may not, because some devices do a decent job at "upscaling" the lower res. For the most part, though, displaying 1080p content on a 4k screen is a bit like running early iOS apps in "2X" mode, where each pixel was doubled in each direction. That's not impressive.
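For the curious, here's what that kind of dumb pixel doubling amounts to: a little Python sketch of my own (not how any particular TV actually does its upscaling) that blows a frame up to twice its resolution in each direction by simply repeating pixels, adding no detail whatsoever:

    def pixel_double(frame):
        # Nearest-neighbor 2x upscale: every pixel becomes a 2x2 block.
        # 'frame' is a list of rows, each row a list of pixel values.
        doubled = []
        for row in frame:
            wide_row = [p for p in row for _ in (0, 1)]  # repeat each pixel sideways
            doubled.append(wide_row)
            doubled.append(list(wide_row))               # repeat the whole row downward
        return doubled

    # A tiny 2 x 2 "frame" becomes a 4 x 4 frame with no new information:
    print(pixel_double([[1, 2],
                        [3, 4]]))
    # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]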

What about hooking up a notebook or desktop to the 4k screen? Well, none of the various computers around our offices supported more than 1080p. And the one Windows desktop I use most often for testing is actually a rather old HP with just a 2.2GHz Core 2 Duo processor. That's still good enough to run Windows 8.1 at a decent clip, but the video from the HP's Asus motherboard maxed out at 1680 x 1050 pixel. So it was time to look around for a video card that could actually drive 4k video.

That, my friends, was a sobering experience for me as I realized how little I knew of current video card technology. Sure, we cover whatever Intel bakes into its latest generation of Core processors, and have a degree of familiarity with some of the discrete graphics subsystems available for various rugged notebooks. But beyond that there's an incredibly complex world of dedicated graphics chips, interface standards, different connectors, as well as an endless array of very specialized graphics features and standards they may or may not support.

I am, I must admit, a bit of a gamer and love spending some relaxing hours playing video games on the Sony and Microsoft consoles. A particular favorite of mine is Skyrim, and so I bought a copy for the PC, to see what it would look like on the Seiki 4k screen. Well, initially I couldn't get the game to work at all on the old HP desktop as its motherboard video didn't support one feature or another. Now it was definitely time to invest in a graphics card.

Several hours of Googling and reading up on things yielded only a vague idea of what might be a feasible solution to our video issue. You can, you see, easily pay more for a sophisticated video card than you pay for an entire computer. And some of those cards need two expansion slots, have large fans, and require massive power supplies just to run in a system. That was out. Was it even possible to get a decent video card that would actually drive 4k video AND work in an old PC like our HP?

The next pitfall was that on Amazon and eBay you really never know if something is the latest technology, or old stuff from a few years ago. Vendors happily peddle old stuff at the full old list price and it's all too easy to get sucked in if you are not totally up to speed. So always check the date of the oldest review.

What eventually worked best was checking some of the tech sites for recent new video chip intros. nVidia and AMD have practically the entire market, and are locked in fierce competition. The actual cards may come from other sources, but they will use nVidia or AMD chips. A bit more research showed that AMD had recently introduced Radeon R7 chips for reasonably priced graphics cards, and those cards actually appeared to support 4k video and use the PCI Express x16 slot that my old desktop had. I truly did not know if that was the same connector and standard (almost every new Intel chip uses a different socket), but it looked the same, and so I ordered a Gigabyte Radeon R7 250 card with a gigabyte of GDDR5 memory on Amazon for the amazingly affordable price of US$89, with no-cost 2-day shipping via Amazon Prime.

The card promptly arrived. And it fit into the x16 slot in the old HP. And the HP booted right up, recognized the card, installed the AMD drivers from the supplied DVD, and did nice 1680 x 1050 video via the card's DVI port on the vintage 22-inch HP flatscreen that had come with the PC. Skyrim now ran just fine.

So it was time to see if the Radeon card would play ball with the Seiki 4k screen. Using a standard HDMI cable, I connected the old HP to the Seiki and, bingo, 4k video came right up. 3840 x 2160 pixel. Wow. It worked.

Windows, of course, even Windows 8.1, isn't exactly a champion at adapting to different screen resolutions, and so it took some messing around with control panels and settings to get the fonts and icons looking reasonably good. And while I have been using a 27-inch iMac for years as my main workstation, 39 inches seems weirdly large. You'd watch a TV this size from a good distance, but for PC work you sit real close, and it doesn't feel quite right.

So now that we had an actual 4k video connection to a 4k display, it was time to look for some 4k content. YouTube has some (search "4k video demos"), and so we tried that. The problem there was that running it requires substantial bandwidth, and our solution—a LAN cable connected to a power line connector to run the signals through the building wiring, and then to our AT&T broadband—apparently wasn't up to snuff. So we saw some impressive demo video, very sharp, but more often than not it stopped or bogged down to very low frame rates. So bandwidth will be an issue.

We also perused pictures in full resolution. 4k is over 8 megapixel in camera speak, and so you can view 8mp pictures in full resolution. The result, while quite stunning from a distance, is actually a little disappointing from up close. The JPEG compression that is usually hardly noticeable on smaller screens is obvious, and then there's the fact that even 4k resolution on a 39-inch screen isn't all that much. It's really just in the pixel density range of an older XGA (1024 x 768) 10-inch tablet, and those have long been left behind by "retina" and other much higher resolution screens.
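That pixel density comparison is easy to verify with the standard pixels-per-inch formula (diagonal pixel count divided by diagonal inches). Here's a quick back-of-the-envelope calculation in Python, using the screen figures mentioned in these entries; the numbers are my own arithmetic, not manufacturer specs:

    from math import hypot

    def ppi(width_px, height_px, diagonal_inches):
        # Pixels per inch: length of the pixel diagonal divided by
        # the physical diagonal of the screen.
        return hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 39)))    # 39-inch 4k Seiki: ~113 ppi
    print(round(ppi(1024, 768, 10)))     # 10-inch XGA tablet: 128 ppi
    print(round(ppi(2048, 1536, 9.7)))   # 9.7-inch "retina" iPad: ~264 ppi

So the big Seiki actually packs fewer pixels per inch than an old XGA tablet, and less than half the density of a retina iPad.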

Then we cranked up the Skyrim game and ran it full-screen. It looked rather spectacular, though it probably ran in 1080p mode, because full 4k resolution would require a good deal more video RAM as well as modified textures. It did look good, but it also bogged down.

After an hour of playing with the 4k video setup I began feeling a bit nauseous and had to stop. The reason, apart from not being used to the large screen from so close up, was almost certainly a serious limitation of the 39-inch Seiki—it runs 4k video at a refresh rate of only 30 Hertz. That is very low. Most modern computer displays run at 60 Hertz or more. You don't actually see flickering, but I am convinced the brain has to do some extra work to make sense of such a low refresh rate. And that can make you nauseous.

One of the original triggers of my impulse decision to get the 4k Seiki screen was to watch 4k video from our GoPro 3 Black cameras. The GoPro's 4k, unfortunately, is really also just a technology demonstration for now, as it runs at just 15 frames per second. Normal video viewing is at 30 fps, and games and other video may run at much higher frame rates yet. So until we can view 4k video at 30 fps and more, it's just an experiment.

So that's where we stand with 4k video. There's a vast discrepancy between the marketing rhetoric that's pushing 4k now, and the fact that there's almost no content. And significant technical barriers in terms of frame rates, bandwidth, standards, and existing hardware and software that just can't handle it. It's a bit like Nokia trying to tell people they need a 41mp camera in a phone when the phone itself can display less than a single megapixel, and it would take more than four 4k screens in a matrix to show a single picture in full resolution.

In summary, I did not go into great detail in my investigations, and there will be many out there who have much more detailed knowledge of all those standards and display technologies. But we did take a common-sense look at what 4k can and cannot offer today. The following just about describes the situation:

- 4k displays will soon be common. Dell already offers a range of affordable models (like this one).
- 4k video support is rapidly becoming available as well, and 4k video cards start at well under $100.
- Some Intel Haswell chips offer integrated 4k video support.
- There's virtually no 4k content available.
- 4k uses more bandwidth than many current linkups can supply.
- The 4k experience is in its infancy, with insufficient refresh rates and frame rates.
- BUT it's clearly the future.
- 4k rugged displays and signage systems are rapidly becoming available.
- Do spend the time learning what it all means, and how it fits together.

Posted by conradb212 at 01:02 AM | Comments (0)

December 26, 2013

Does your Pentium have an Atom engine?

There was a time in the very distant computing past when the sole decision you needed to make in buying a computer was whether to use the Intel 386/33 or save a few bucks and get the slightly slower 386/25. Today, if you use Intel's handy ARK app that lists every product available from Intel, there are a staggering 1,874 different processors listed. That includes processors targeted at desktop, server, mobile and embedded computers, but even if you leave out servers and desktops, there's still the choice of 949 processors for mobile and embedded applications. Not all of them are state-of-the-art, but even chips designated as "legacy" or "previous generation" are still in widespread use, and available in products still being sold.

The mind-blowing number of Intel processors available brings up the question of how many different Intel chips the world really needs. As of the end of 2013, Apple has sold about 700 million iPhones, iPads and iPod Touch devices that made do with a mere dozen "A-Series" chips. Not too long ago, tens of millions of netbooks all used the same Intel Atom chip (the N270). So why does Intel make so many different chips? Even though many are based on the same microarchitectures, it can't be simple or cost-efficient to offer THAT wide a product lineup.

On the customer side, this proliferation serves no real purpose. End users make almost all purchasing decisions based on price. Figure in desired screen and disk sizes, and whether there's an Atom, Celeron, Pentium, Core i3, Core i5, or Core i7 chip inside is confusing at best. For hardware manufacturers it's worse, as they must deal with very rapid product cycles, with customers both demanding legacy support AND availability of the latest Intel products. THEY must explain why this year's Intel crop is so much better than last year's, which was so much better than what Intel offered the year before. Or which of a dozen roughly identical Intel chips makes the most sense.

As is, Intel has managed to bewilder just about anyone with their baffling proliferation of processors, and without the benefit of having established true brand identities. What Intel might have had in mind was kind of a "good, better, best" thing with their Core i3, i5 and i7 processors, where i3 was bare-bones, i5 added Turbo mode and some goodies, and i7 was top-of-the-line. But that never really worked, and the disastrous idea to then come up with a generation-based system that automatically made last year's "generation" obsolete only adds to the confusion. And let's not even get into Intel "code names."

Atom processors were supposed to provide a less expensive alternative to the increasingly pricey Core chips—increasingly pricey at a time when the overall cost of computers became ever lower. Unfortunately, Intel took the same approach with Atom as Microsoft had taken with Windows CE—keep the line so wingclipped and unattractive that it would not threaten sales of the far more profitable Windows proper and mainline Intel chips. At RuggedPCReview we deeply feel for the many vertical and industrial market hardware and systems manufacturers who drank Intel's Atom Kool-Aid only to see those Atom processors underperform and quickly need replacing with whatever Intel cooked up next.

But that unfortunate duality between attractively priced but mostly inadequate entry-level Atom chips and the much more lucrative mainline Core chips wasn't, and isn't, all. Much to almost everyone's surprise, the Celeron and Pentium brands also continued to be used. Pentium goes back to the early 1990s, when Intel needed a trademarkable term for its new processors, having gotten tired of everyone else also making "386" and "486" processors. "Celeron" came about a few years later, around 1998, when Intel realized it was losing the lower end of the market to generic x86 chipmakers. So along came the Celerons, mostly wingclipped versions of Pentiums.

Confusingly, the Celerons and Pentiums continued to hang around when Intel introduced its "Core" processors. The message then became that Pentiums and Celerons were for those who wouldn't spring for a real Core Duo or Core 2 Duo processor. Even more curiously, Pentiums and Celerons still continued when the first generation of the "modern era" Core processors arrived, and then the second, third and fourth generation. Study of spec sheets suggested that some of those Pentiums and Celerons were what one might have called Core i1 and i2 chips, solutions for when costs really needed to be contained to the max. In some cases it seemed that Intel secretly continued its long-standing tradition of simply turning off features that were really already part of those dies and chips. A weird outgrowth of that strategy were the last-ditch life-support efforts of the dying netbook market to answer the calls for better performance by ditching Atom processors in favor of Celerons that were really slightly throttled Core i3 processors. That actually worked (we have one of those final netbooks in the RuggedPCReview office, an Acer Aspire One 756, and it's a very good performer), but it was too little, too late for netbooks, especially against the incoming tide of tablets.

Given that the choice of mass storage, the quality of drivers, keeping one's computer clean of performance-zapping gunk and, most of all, the speed of one's internet connection (what with all the back and forth with The Cloud) seems to have a far greater impact on perceived system performance than whatever Intel chip sits in the machine, it's more than curious that Celeron and Pentium are not only hanging around, but have even been given yet another lease on life, and this one more confusing than ever.

That's because Intel's latest Atom chips, depending on what they are targeted at, may now also be called Celerons and Pentiums. It's true. "Bay Trail" chips with the new Atom Silvermont micro architecture will be sold under the Atom, Celeron and Pentium brand names, depending on markets and chip configurations. Pontiac once took a heavy hit when the public discovered that some of their cars had Chevy engines in them. Pontiac is long gone now, and General Motors has ditched other brands as well, realizing that confusing consumers with too many choices made little sense. Even GM, however, didn't have anywhere near the dominance of their market as Intel has of its market.

Where will it all lead? No one knows. Intel still enjoys record profits, and other than the growing competition from ARM there seems little reason to change. On the other hand, if the current product strategy continues, four years from now we may have 8th generation Core processors and 4,000 different Intel chips, which cannot possibly be feasible. And we really feel for the rugged hardware companies we cover. They are practically forced to use all those chips, even though everyone knows that some are inadequate and most will quickly be replaced.

PS: Interestingly, I do all of my production work at RuggedPCReview.com on an Apple iMac27 powered by a lowly Core 2 Duo processor. It's plenty fast enough to handle extreme workloads and extensive multitasking.

Posted by conradb212 at 06:35 PM | Comments (0)

December 13, 2013

Michael Dell's keynote at Dell World 2013: reaching for the cloud

One big problem with being a public company is that every three months it's imperative not to disappoint analysts and investors. Dell won't have to worry about that anymore because it returned to being a private company. That means Dell can now take the longer look, pursue the bigger picture, and no longer suffer from the affliction of short-term thinking, as Michael Dell so eloquently put it in his keynote address at the 2013 Dell World conference in Austin, Texas.

And that was the core of Michael Dell's message: that as a private company Dell now has the freedom to make the bold moves that are necessary, invest in emerging markets, and take a long-term view of its investments. Dell said he senses a new vibe of excitement at the company that bears his name, and that he feels like he is part of the world's largest startup.

He did, of course, also mention that sales were up in the double digits, that 98% of the Fortune 500 are Dell customers, and that Dell has established a new Research Division as well as the Dell Venture Fund. Dell reminisced how he had taken an IBM PC apart in his dorm room, found that many of the components weren't actually made by IBM but were steeply marked-up 3rd-party components, and how he felt he could make that technology available cheaper and better. He expounded on a recurring theme in his keynote, that the proliferation of computer technology between roughly 1985 and 2005 also saw poverty in the world cut by half.

Dell showed an Apple-esque commercial that will run in 2014 and pays homage to all the little companies that started with Dell, including the one in dorm room #2714 (Dell itself, founded on a thousand bucks). Nicely done and charming. He spoke of how the future is the move beyond products and to end-to-end scalable solutions. He spoke of $13 billion invested in research, highlighted the Dell PowerEdge servers that are in the top two in the world, and demonstrated, right on stage, how integrated server, storage and network functionality in fluid cache technology clocked in at over 5 million IOPS (input/output operations per second).

Dell spoke of their new mantra: to transform, connect, inform, and protect. Transform as in modernize, migrate and transition to the future. Connect as in connecting all the various computing devices out there, including the Dell Wyse virtual clients, because, still and for a good time to come, "the PC, for a lot of organizations, is how business is done." Inform, as in turning data into useful, productivity-enhancing results, making companies run better with data analytics. And protect, as in offering next-gen firewalls and connected security to keep data safe and ward off attacks before they even happen.

Dell also reminded the audience that the company has been the leader in displays for over a decade, and touched on the 4k2k video resolution that is available from Dell now, another example of Dell making disruptive technology available at accessible prices.

Dell then introduced Elon Musk, who had arrived in a red Tesla Model S, his company's groundbreaking electric car. Along came David Kirkpatrick, who took on the job of engaging Musk interview-style. Musk, of course, also pioneered PayPal, and, in addition to Tesla, runs SpaceX, sending rockets into space. Musk was there, however, not so much to discuss technology, but to illustrate the impact of pursuing the big picture, big ideas, things that simply need to get done. As if on cue, Dell rejoined the conversation, congratulating Musk and bemoaning the affliction of short-term thinking that hamstrings progress and big ideas and bold bets.

Musk said this must be the best time in human history, with "information equality" breaking down barriers and making the world a safer, better place, meshing with Dell's clear belief that technology is a boon for mankind. Today, Musk said, anyone with internet access has more information available than the very President of the United States had just 30 short years ago. Musk also expressed regret over a pessimistic, negatively biased media that always seems to seek out the bad news.

The address concluded with an observation by Kirkpatrick about Tesla cars' constant connection with the company's computers, and how that information feedback is used to make the cars better. That, Dell said, mirrors his company's approach with customers: a constant back and forth, and constant feedback, in the quest to improve and innovate.

This 75 minute keynote was a bold move with a big picture and a big message, with Dell, Musk and Kirkpatrick almost like actors in a play. Quite a task to pull this off, but the message was loud and clear and finely tuned: Dell is now free to pursue big ideas. Like Elon Musk with his electric car and rockets shooting into space, Dell can now reach for the cloud(s), and beyond.

Posted by conradb212 at 02:57 PM | Comments (0)

December 05, 2013

Thoughts about display resolutions

The resolution of computer displays is an interesting thing. There are now handhelds with the same number of pixels as large flatscreen TVs, Apple claims its "retina" displays are so sharp that the human eye can no longer see individual pixels, and the very term "high definition" is in the process of being redefined. Let's see what happened with display resolution and where things are headed, both in handhelds and in larger systems, and what 4k2k is all about.

Color monitors more or less started with the original IBM PC's 320 x 200 pixel CGA resolution. In 1982, the monochrome Hercules video card provided 720 x 348 pixel for IBM compatibles. For a long time IBM's 640 x 480 VGA, introduced in 1987 with their PS/2 computers, was considered a high resolution standard for the desktop and for notebooks. Then came 800 x 600 pixel SVGA and 1024 x 768 pixel XGA, and those two hung around for a good decade and a half in everything from desktops to notebooks to Tablet PCs. Occasionally there were higher resolutions or displays with aspect ratios different from the 4:3 format that'd been used since the first IBM PC, but those often suffered from lack of driver and software support, and so pretty much everyone stayed with the mainstream formats.

It was really HDTV that drove the next advance in computer displays. During a 1998 factory tour at Sharp in Japan I had my first experience with a wide-format TV. It looked rather odd to me in my hotel room, sort of too wide and not tall enough and not really a TV, but, of course, that turned out to be where things were going. In the US, it would take a few more years until the advent of HDTV brought on wide-format, and terms such as 720p and 1080p entered our tech vocabulary. For a good while, smaller and less expensive flatscreen TVs used the 1280 x 720, or "720p," format, while the larger and higher-end models used full 1920 x 1080 pixel "1080p" resolution. That meant a wide 16:9 aspect ratio.

Desktop and notebook displays quickly followed suit. The venerable 1024 x 768 XGA became 1366 x 768 WXGA, and full 1920 x 1080 pixel displays became fairly common as well, albeit more on the desktop than in mobile systems. Professional desktops such as the 27-inch Apple iMac adopted 2560 x 1440 resolution. On the PC side of things, standards proliferated in various aspect ratios, resulting in unwieldy terminology such as WSXGA+ (1680 x 1050) or WQXGA (2560 x 1600).

An interesting thing happened. Whereas in the past, TVs and computer displays had very different ways to measure resolution, they're now more and more the same, what with flatscreen TVs really being nothing more than very large monitors. The 1920 x 1080 pixel 1080p format, in particular, is everywhere. Amazingly, that's becoming a bit of a problem.

Why? Because as TVs become ever larger, the same old 1080p resolution no longer looks quite as high definition as it did on smaller screens. Put a 42-inch and a 70-inch TV next to each other and you can plainly see the degradation in sharpness. The situation isn't as drastic on notebooks because, after growing ever larger for many years, notebook displays have leveled off in size, and have even shrunk again (Apple no longer makes the 17-inch MacBook Pro, for example). Desktop monitors, however, keep getting larger (I use two 27-inch monitors side-by-side), and that means even "high definition" 1920 x 1080 doesn't look so good anymore, at least not for office-type work with lots of small text. While I was excited to get a reasonably priced HP Pavilion 27-inch 1080p IPS monitor for the PC sitting next to my Mac, I find it almost unusable for detail work because the resolution just isn't high enough to cleanly display small text.

While current resolution standards are running out of steam on larger displays, the situation is quite different in the small screens used on handhelds and smartphones. There we have a somewhat baffling dichotomy where many industrial handhelds still use the same low-res 320 x 240 QVGA format that's been used since the dawn of (computer) time, whereas the latest smartphones have long since moved to 1280 x 800 and even full 1920 x 1080 resolution. Tablets, likewise, pack a lot of pixels onto the rather small 7-inch and 10-inch formats that make up the great majority of the tablet market. Apple led the way with the "retina" 2048 x 1536 pixel resolution on the 3rd generation iPad. That's like a 2x2 matrix of 1024 x 768 pixel XGA displays all in one small 9.7-inch screen. Trumping even that, the latest Kindle Fire HDX tablet packs an astounding 2560 x 1600 pixel onto its 8.9-inch screen. So for now, smartphones and tablets are at the front of the high-resolution revolution.

Somehow, we quickly get used to higher resolution. Older displays that we remember looking great now look coarse and pixellated. With technology you can never go back. The state-of-the-art almost instantly becomes the acceptable minimum. Whereas our eyes used to expect a degree of blurriness and the ability to see individual pixels on a screen, that's less and less acceptable as time goes on. And it really does not make much sense to declare 1080p as "high definition" when by now that resolution is used on anything between a smartphone and an 80-inch TV.

Fortunately, the next thing is on the horizon for TVs and monitors, and it's called 4k2k, which stands for roughly 4,000 x 2,000 pixel. 2160p would be a better name for this likely standard, as it is simply a 2x2 matrix of four current 1080p resolution displays, or 3,840 x 2,160 pixel. That still only means that a giant new 80-inch screen will have no more than the pixel density of a 1080p 40-inch display, but it's certainly a logical next step.

I had all of this on my mind when I received an email offer from one of my favorite electronics places. It was for a 39-inch Seiki TV/monitor with 4k resolution for a very attractive price and free shipping. I impulse-ordered it on the spot, telling myself that I needed to know where 4k technology stands and what, at this point, it can and cannot do. And this would finally be a monitor where I could watch the 4k video my GoPro 3 Black Edition can produce.

So I got the Seiki, and it's a great deal and a bargain. Or it would be if I actually had anything that could drive a 4k2k display in its native mode, which I don't. In fact, at this point there is virtually nothing that can drive a 4k display in full 3840 x 2160 pixel resolution. Yes, the 4k videos from my GoPro 3 Black Edition would probably look great on it, but that would require me to copy the video footage to a PC that can drive an external 4k monitor, which virtually no stock PCs can do today. DVD or Blu-Ray players certainly can't display in 4k2k, and even brand-new gear like the Sony PS4 game console can't. I COULD, of course, get a low-end 4k-capable video card from AMD, but I am not sure any of the PCs in the RuggedPCReview office could actually even accommodate such a card.

The unfortunate truth is that as of late 2013, there's very little gear that can send a true 4K video signal to a 4K TV or monitor. Which means that most content will be viewed in up-sampled mode, which may or may not look great. This will undoubtedly become a marketing issue in the consumer space—there will be great interest and great expectations in 4K TVs, but just as was the case with 3D TVs a couple of years ago, there will be virtually no 4K sources and content. And that can make for a customer backlash. There is some very detailed information on Amazon (see here) that provides an idea of where things stand.

What does all that mean for rugged mobile technology? Not all that much for now, but I am certain that the ready availability of super-high resolution on smartphones and consumer tablets will change customer expectations for rugged device displays just as capacitive touch changed touch screen expectations. Once the (technology) cat's out of the bag, that's it. It won't go back in.

And just as I finished this entry, I see that Dell announced 24-inch and 32-inch UltraSharp monitors with 4k 3840 x 2160 resolution, and a 28-inch version will soon follow (see Dell news). Given that Dell is the leading flat-panel vendor in the US and #2 in the world, that likely means that we'll soon see a lot more systems capable of supporting 4k resolution.

Posted by conradb212 at 04:53 PM | Comments (0)

November 14, 2013

State of Outdoor-Viewable Displays Late 2013

One of the big differentiating factors in ruggedized mobile computers is how well the display is suited for work outdoors in bright daylight and in direct sunlight. This can make the difference between a device being useful and productivity-enhancing, or frustrating and nearly useless.

Why is this such a big issue? Aren't today's displays so good that the only thing that matters is how large a display you want, and perhaps what resolution it should have? For indoor use that's true, but outdoors it's an entirely different story.

The outdoor viewability problem

Overall, LCD displays have come a very long way in the last two decades. If you're old enough, you probably remember those very small notebook displays that you could barely read. And if you looked at them from an angle, the color—if you had color—shifted in weird ways. Almost all of today's LCD displays are terrific. They are bright and sharp and vibrant, and you can view them from almost any angle (and depending on the technology, from all angles). Most of today's tablet and notebook displays are so good that it's hard to imagine they could get any better.

Until you take them outdoors, that is.

The difference between indoors and outdoors is amazing. A screen that is bright indoors will almost wash out when you take it outdoors on a sunny day. That's because even a very strong backlight is no match for the sun. Even very good displays become almost unreadable when they are facing the sun. The contrast goes away, the screen may go dark or it may become so reflective that it can't be used anymore. Some displays also assume strange hues and casts and colors when in the sun. Others have a shimmering iridescent look that distracts the eye. And resistive touch screens have a slightly yielding top surface that can make for optical distortions that can be very distracting.

Matte and glossy displays

Most notebook and tablet displays these days are very glossy. That's a trend that started perhaps a decade ago in Japan where vendors hoped bright, glossy screens would be more easily noticed in crowded electronics shop windows. Another argument for glossy displays was that they make the picture "pop" with rich color and sharp contrast when watching video or movies. That probably depends on the individual viewer, but overall glossy screens work well enough indoors (where there are few reflections) that virtually all manufacturers switched to them. Outdoors where there are a lot of reflections, glossy displays can be very hard to view.

Some tablets and notebooks have matte screens or they have anti-glare coatings. A "matte" surface can be achieved via a liquid coating with tiny particles that then diffuse light, chemical etching that makes for a rougher surface, or mechanical abrasion. The much reduced intensity of light reflection makes matte surfaces ideal for office workers. You'd think that matte displays are also the answer for good outdoor viewability, and matte displays can indeed handle reflections much better outdoors as well. The problem, though, is that matte screens, especially older ones, just diffuse the light. When that happens outdoors where light can be very strong, the screen can turn milky and becomes hard or impossible to read.

The different display technologies

Most of today's standard LCDs are transmissive, which means that you have a backlight behind the LCD. This approach works great indoors because the ratio between the strength of the backlight and the reflected ambient light is very large. Outdoors, the ambient light is much stronger, and so the ratio between the strength of the backlight and the amount of reflected light is much smaller, which means there is much less contrast.

In the past, notebook manufacturers tried different approaches to make the screens readable outdoors.

One approach was to use reflective LCDs instead of transmissive ones. This way, the brighter the sunlight, the more readable the display becomes. This never caught on for two reasons. First, since you couldn't use a backlight, you needed a sidelight to make the screen viewable indoors. That just doesn't work with displays larger than those in a PDA. Second, even outdoors, the screens looked flat because the LCD background was greenish-brown, and not white.

Another approach was "transflective" screens. Transflective screens were part transmissive so that you could use a backlight, but also part reflective so you could see them outdoors. This was supposed to be the best of both worlds, but it was really just a compromise that didn't work very well. So early transflective display technology was abandoned.

Today, most outdoor displays use what one might call modified standard transmissive technology. These screens perform just like standard transmissive displays indoors, while controlling reflections and preserving contrast outdoors. They do that with various tricks and technologies such as optical coatings, layer bonding, and circular polarizers to reduce reflected light. The overall goal of all these measures is to control distracting reflections and to get the best possible screen contrast. That's because for keeping a display readable outdoors and in sunlight, preserving display contrast is more important than anything else. That's where effective contrast comes into play.

Getac's QuadraClear brochure shows the effect of linear and circular polarizers

Almost all major manufacturers of ruggedized mobile technologies have their own special approaches such as "QuadraClear" (Getac), "CircuLumin" (Panasonic), "MaxView" (Handheld Group), "xView Pro" (MobileDemand), "View Anywhere" (Motion Computing), "IllumiView" (Juniper Systems), "AllVue" (Xplore Technologies), and more.

What matters is effective contrast

There are various definitions of contrast. The most important one in outdoor displays is the "effective" contrast ratio, which doesn't deal with the annoying mirror-like reflections glossy screens are infamous for, but rather with the sum-total of the light reflected by the various layers a typical LCD assembly consists of. The effective contrast ratio is the ratio between that reflected light and the light generated by the display's own backlight.

There are, in essence, two major ways to control those internal reflections. One is adding a circular polarizer, which combines a linear polarizer and a retardation film to block reflected light. The other is bonding together layers of the LCD assembly, thus eliminating two reflecting surfaces with every bond. How well it all works depends on exactly how these elements are combined to produce the best possible effect, and that's generally a closely-guarded secret.

A rule-of-thumb formula to compute the effective contrast ratio of an LCD screen used outdoors is 1 + emitted light divided by reflected light. For emitted light we use the backlight, measured in nits. For reflected light we multiply moderate sunlight, which is the equivalent of about 10,000 nits, by the percentage of light reflected by the display. A normal, untreated notebook screen reflects about 2% of sunlight. A combination of optical coatings can bring that number down to about half a percent for displays without touch screens, and about 0.9% for displays with (resistive) touch screens.

If you use this formula and plug in the numbers, you find that a standard notebook without any optical treatments has an effective contrast ratio of about 2:1, which means it's pretty much unreadable outdoors. If you boost the backlight or apply optical coatings, the contrast ratio goes up to about 6:1, which is the minimum the military requires in its computers. If you boost the backlight AND apply coating, you get contrast ratios of about 6 or 7 for displays with resistive touch, and up to 11 or 12 without.
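For those who like to plug in their own numbers, here's the rule of thumb as a few lines of Python. The backlight and reflectance figures are the assumed values from the paragraphs above (10,000-nit moderate sunlight, a typical 200-nit consumer backlight), so treat this as a sketch of the estimate, not a measurement:

    def effective_contrast(backlight_nits, reflectance, sunlight_nits=10000):
        # Rule-of-thumb effective contrast ratio of an LCD outdoors:
        # 1 + emitted light / reflected light.
        reflected_nits = sunlight_nits * reflectance
        return 1 + backlight_nits / reflected_nits

    # Standard notebook: 200-nit backlight, untreated screen reflecting 2%
    print(round(effective_contrast(200, 0.02), 1))    # 2.0 -- unreadable outdoors

    # 500-nit backlight plus coatings, resistive touch screen (0.9%)
    print(round(effective_contrast(500, 0.009), 1))   # 6.6

    # 500-nit backlight plus coatings, no touch screen (0.5%)
    print(round(effective_contrast(500, 0.005), 1))   # 11.0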

The bottom line with these "modified transmissive" displays is that there are a number of factors that affect the effective contrast ratio and thus display viewability outdoors. It all boils down to the best possible combination of optical coatings and layer manipulations to reduce internal reflection, plus a good strong backlight.

Super-bright backlights

If that is so, then why not just have the strongest possible backlight? That is indeed one good way to boost the viewability of an outdoor display, but it comes at the cost of either a larger, heavier battery or less battery life. There are ways to minimize the extra drain on the battery, one being a light sensor that will throttle the backlight whenever full brightness is not needed, and another the presence of an easily accessible "boost" button that lets the user turn extra brightness on or off.

If you're wondering how backlights are measured, an often-used unit is the nit. The official definition of a nit is that it is "a unit of illuminative brightness equal to one candle per square meter, measured perpendicular to the rays of the source." Most consumer notebooks and tablets are in the 200 nit range, most semi-rugged and rugged devices designated as "outdoor-viewable" are in the 500-700 nit range, and some "sunlight-viewable" screens go all the way to 1,000-1,500 nits.

Does a touch screen impact outdoor viewability?

If it is resistive touch, yes, it usually does make a difference. That's because light reflects on every surface of the LCD assembly, and resistive touch adds additional surfaces, therefore lowering the effective contrast ratio. Another problem is that the soft, yielding surface of resistive touch screens results in distorted reflections that are not present in totally smooth glass surfaces.

Capacitive touch, which is increasingly used even in rugged devices, doesn't have this problem. However, it always pays to closely examine the display under all sorts of extreme lighting conditions, as some non-resistive technologies can have distracting grid patterns that become visible outdoors.

Hybrid approaches

In addition to reflective, transflective and transmissive screens and the various ways to tweak them for better outdoor performance, there are some interesting hybrid approaches. One of them is Pixel Qi's enhanced transflective technology where display pixels consist of a transmissive and a reflective part that have separate drivers. In essence, that allows them to have a "triple mode" display that can use one or both technologies, depending on the lighting situation. A while ago, RuggedPCReview had the opportunity to examine the Pixel Qi technology in a detailed review (see here), and we concluded that the technology absolutely works. However, potential users need to be aware of its inherent gradual switching from full color indoors to muted color outdoors and black and white in direct sunlight as that may affect color-dependent apps.

Most recently, Pixel Qi has come out with displays that do show colors in direct-sunlight reflective mode, but we have not had hands-on time with any of those yet. Also of note is that Pixel Qi's founder and chief technology officer left the company in early 2013 to work on the Google Glass project, and that's not good news for Pixel Qi.

How about OLED?

What about the OLED/AMOLED technology that is touted as the next big thing in flatscreen TVs, better than either LCD or Plasma? OLED screens have been used in some cameras, phones and even scuba diving computers for a while now, but I can't see them as a candidate for sunlight-viewable displays. That's because OLED technology is essentially a grid of special LEDs that easily washes out in sunlight or even bright daylight.

Other important considerations

On top of using the best possible outdoor-viewable display available for a given job, you also want good underlying LCD technology and good optics. A wide viewing angle makes the display easier to read, and we always strongly suggest starting with an IPS (In-Plane Switching, see Wiki) or, better yet, an AFFS (Advanced Fringe Field Switching, see Wiki) screen so that viewing angles and color shifts simply aren't an issue. Some anti-glare coatings can create annoying reflections that trick your brain into filling in detail, which makes the screen even less readable than it would be without the coating. You also don't want a display that reflects distorted images. That again confuses your brain and makes the screen harder to read. And you want a display that is as resistant to fingerprints as possible, because fingerprints can become enormously visible and distracting under certain outdoor lighting conditions.

Examples from RuggedPCReview.com testing

Below are examples from RuggedPCReview testing that illustrate some of the issues and properties discussed:

Below: The first image shows pretty much how the modern era of optically treated transmissive displays got started around 2007 when Itronix introduced their DynaVue outdoor-readable display technology. The image shows how DynaVue compared to an earlier non-DynaVue Itronix notebook.

Below: All of a sudden it was possible to control reflection to an extent where displays remained viewable in direct sunlight. An older Itronix notebook with a matte display surface, by comparison, diffuses the sunlight so much that the screen is totally blown out.

Below: Unbeknownst to most, Dell was another pioneer with optically treated outdoor-viewable displays. In 2007, a Dell Latitude ATG D630 with its 500-nit backlight offered excellent outdoor viewability with an 11:1 effective contrast ratio. Dell did that by bonding a piece of glass onto the polarizer film, thus eliminating the polarizer film's reflectivity, and then treating the smooth exterior surface of the glass.

Below: Comparison between a "modern-style" optically treated transmissive display on an Xplore rugged tablet, and the matte display on a competing product. Indoors both tablets looked great, but outdoors the difference was obvious.

Below: The same Xplore iX104 tablet versus one of the original convertible Tablet PCs. The matte, non-treated 2002-era Toshiba Portege display simply vanishes outdoors.

Below: Matte displays can work quite well outdoors; this DRS ARMOR tablet mutes the reflection and remains quite viewable, though you have to hunt for the right viewing angle.

Below: That's a Getac rugged notebook on the left versus a Gateway consumer notebook with a glossy display that looked great indoors and even had decent contrast outdoors, but the glossy screen's reflections made it unusable there.

Below: A GammaTech SA14 semi-rugged notebook compared to an Apple MacBook Pro. Indoors the MacBook excels; outdoors it falls victim to reflections.

Below: Here we see how a rugged MobileDemand T7200 xTablet compares to a Google Nexus 7 consumer tablet. Same story: indoors the Nexus screen looks terrific, outdoors it's all reflections, although the display itself remains quite viewable.

Below: Motion CL910 tablet vs. iPad 3—the iPad's retina display is terrific and is even viewable outdoors, but the super-glossy surface is very prone to reflections.

Below: MobileDemand pioneered using a modified Pixel Qi display in a rugged tablet, their xTablet T7200. Note how the display works under all lighting conditions, from indoors to direct sun where the display switches to gray-scale reflective.

Below: An example of the interesting optical properties of treated outdoor displays. The two Motion Computing tablets are both excellent outdoors, and have some of the best displays anywhere, but it's clear that they have different optical treatments.

Below: Another example of the odd optical properties of some displays. This one is basically quite viewable, but the wavy, distorted reflections make it difficult to use.

Below: An example of brute force—the Getac X500 rugged notebook has a superbright backlight that, combined with very good optical treatment, makes this one of the best displays available.

So what's the answer?

While there are a number of interesting alternative display technologies, at this point, the best overall bet is still a combination of optical treatments and coatings, direct bonding to reduce the number of reflecting surfaces, a reasonably strong backlight, and a touch technology with as little impact on display viewability as possible. The following all contribute to the best currently possible outdoor-viewable display:

  • IPS or AFFS LCD (for perfect viewing angle)
  • Anti-reflective and anti-glare treatments
  • Circular polarizers (combination of a linear polarizer and a retardation film to block reflected light)
  • Minimal number of reflective surfaces via direct bonding
  • Sufficiently strong backlight
  • Hard, flat surface to eliminate distortions
  • Suitably high resolution
  • Touch technology that does not show, distort or create optical aberrations
  • Surface that's not prone to fingerprints

As is, most major vendors of rugged mobile computing technology offer outdoor/sunlight-viewable displays standard or as an option. Most perform quite well, though there are significant differences that can really only be evaluated in side-by-side comparison, as the industry does not generally disclose exactly how displays are treated. Such disclosure, and also the inclusion of effective contrast ratios in product specs, would be tremendously helpful. That, of course, would require generally agreed-on standard definitions and testing procedures.

The last word in outdoor-viewable display technology has not yet been spoken, and it's more than likely that disruptive new technologies will replace what's available now. However, today's technology has reached a point where it can be good enough to allow work in bright daylight and even direct sunlight, though it definitely pays to see for yourself which particular implementation works best for any given project. -- Conrad H. Blickenstorfer, November 2013

(For the definitive article on the modified transmissive approach, see "GD-Itronix DynaVue Display Technology" by Geoff Walker. It goes back to 2007, but all the principles remain valid today. Thanks also to Mr. Walker for feedback and suggestions for this article.)

Posted by conradb212 at 10:51 PM | Comments (0)

October 23, 2013

Two annoying trends

Today I am going to rant a bit about two trends that simply make no sense to me.

The first is "skeuomorphism." It's the new fashion word-du-jour, what with Apple and Microsoft demonizing it as if it were some sort of evil plague. As is, Wiki defines a skeuomorph as "a derivative object that retains ornamental design cues from structures that were necessary in the original." That includes, of course, many elements of graphical user interfaces. The desktop metaphor, after all, has been at the very core of every graphical user interface for the past 30 years.

But now that very quality, making objects on a computer screen look just like the real thing, has come under heavy attack. And that even includes the three-dimensional nature of the real world. Apple, especially, but also Microsoft, now want everything to be flat. As flat as possible. Anti-skeuomorphism forces argue that the public has now been exposed to computers long enough to no longer need the analogy to real, physical things. And so, in the eyes of many, the latest versions of many operating environments, and Apple's iOS, look dreadfully flat and barren indeed.

In Apple's case, one could well argue that a bit of a backlash against skeuomorphic excess was probably a good thing. Apple, long the champion of good design, had begun slipping with some truly kitschy stuff, like the "leather-bound" address book, the Ikea-style wooden bookshelf and other affronts that likely would have had a healthy Steve Jobs frothing at the mouth. So a bit of streamlining was in order, but when I look at the sad, sparse, flat expanse and eensy-tiny lettering that now mars iOS and many of its apps, the sad icons that look like they just want to vanish from view, and the rest of the bleakness that now adorns iPhones and iPads, I wish Jonathan Ive and colleagues had stuck with hardware.

You could argue, of course, that after decades of visual improvements and fine-tuning, the anti-skeuomorphism crusade simply rings in the advent of fashion in electronic interfaces. Just like fashion goes to extremes just to then denounce the trend and swing into the other extreme (neatly obsoleting billions of dollars worth of product in the process), perhaps we'll now have to put up with anemic, flat computer and tablet screens until the trend reverses and everything becomes three-dimensional and lively again.

Okay, the second trend is the push to thin and slender hardware design at all cost. The just-announced new Apple iPad Air is hailed as a wondrous achievement because it's thinner yet and weighs even less. It's a veritable Barbie of a tablet. And since this is Apple, and Apple decreed some years ago that computing devices need to be rectangular and very flat, we now have hyper-slim smartphones, hyper-thin tablets, and even hyper-thin iMacs, which in the latter's case makes absolutely no sense since they sit on a desk in front of you. And we also have hyper-thin HDTVs. Size is okay, as we now have smartphones with screen sizes approaching 6 inches and flat screen TVs approaching 90 inches. But it all must be very flat and thin.

Why?

I mean, making that technology so very flat simply makes it more difficult to design and manufacture, and since hardware happens to be a physical thing it often loses utility if it's pressed into too flat a design (the new iPad Air's battery is down to 32.9 WHr, vs. the iPad 4's 43 WHr). The dreadful sound of flat-screen TVs is a prime example, and the so-so battery life of many super-slim smartphones another. More and more, the trend to supreme thinness seems a narcissistic quest to prove that one's technology is so advanced that mere physicality no longer matters. Sort of like a supermodel starving herself into a skeletal, gaunt appearance just to be lauded for her discipline and elegance.

It makes no sense. I mean, the latest crossover SUV probably weighs almost 5,000 pounds, a house weighs an awful lot, and American people weigh more all the time, too. So why is ultimate thinness in an electronic device such a virtue?

And it especially makes no sense for rugged devices, where the very physicality of the design provides the structure and toughness to make it last on the job. And where a degree of volume means it'll run cooler and provide space for expansion and versatility. Yet even rugged devices are getting thinner all the time. They have to, or the public, even customers in enterprise and industrial markets, will stay away.

So there, two silly trends. And trends they are, as you can't keep making physical stuff thinner beyond a certain point. Once that point is reached, the quest is over, and the pendulum will reverse or go elsewhere. It's quite possible that the next Steve Jobs will some day present the latest super-gadget, and it will be egg-shaped. It's possible.

Be that as it may, I just hope that technology will stay as free from fashion dictates as possible. Because once it takes fashion to sell gear, that means innovation is over.

Posted by conradb212 at 08:21 PM | Comments (0)

October 18, 2013

Rugged Android device comparison table, and contemplations over Android in the rugged market

On October 18, 2013, RuggedPCReview.com launched a rugged Android device comparison table. The table allows interested parties to view full specifications of all rugged handhelds and rugged tablets we're aware of.

Given the absolutely massive number of Android devices activated worldwide -- about a billion -- it's amazing how few rugged Android devices are available. As we recently reported, both Honeywell/Intermec and Motorola Solutions have launched initiatives to make rugged Android devices available to industrial and enterprise markets, and other manufacturers are offering ruggedized Android-based handhelds and tablets as well. But there aren't many actual devices, probably fewer than a couple dozen in both categories combined. And even that small number includes products that are available with either Android or one version of Windows Mobile or another, which means they aren't really optimized for either OS.

Add to that the fact that few of the available Android-based rugged devices are on the latest, or even a recent, version of Android, and that much of the hardware isn't anywhere near the technological level of consumer smartphones and tablets, and one has to wonder all over again why Android has such a terribly hard time getting going in rugged/industrial devices.

On Microsoft's website you'll find a white paper entitled "Choose Windows Mobile Over Android for Ruggedized Handhelds" written by Gartner in February 2011 (see here). Among the key recommendations there were to "remain with Windows Mobile for ruggedized handheld-computer solutions, and to prepare for a transition to full Windows in subsequent implementations" and to "limit the scope of Android-based ruggedized application development through 2013." Of course, the two and a half years since Gartner issued the paper is an eternity in mobile electronics. At the time they still mentioned Android as the #2 smartphone OS behind Symbian!

Gartner also cautioned that the lack of FIPS 140 compliance (FIPS 140 is a U.S. government computer security standard that specifies cryptographic requirements) was an issue for Android, and they predicted that enterprise software vendors would probably create HTML5-based client applications with cross-platform abstraction layers to provide some support for Android devices. FIPS 140 compliance was indeed an issue with Android, and one that's even now still only addressed on a device-by-device basis. Cross-platform application development is now available via platforms such as RhoMobile and iFactr.

I don't know how widely read Gartner's 2011 report was, but over the past three years the rugged computing industry certainly heeded Gartner's advice of choosing Windows Mobile over Android for ruggedized handhelds. Gartner's 2011 arguments made sense, but probably even Gartner didn't foresee that the installed base of Android devices would grow from under 200 million back in 2011 to a cool billion today. One could easily argue that playing it safe with Windows Mobile precluded participating in the rapid, massive learning curve with Android over the past two or three years.

There are no good answers, and hindsight is always 20/20. Except that even two or three years ago it was quite obvious that Windows Mobile was doomed, and Microsoft did not seem to have a compelling roadmap in the mobile space. In historic terms, the predicament the rugged handheld and tablet manufacturers have been facing over the Android issue is no less harrowing than the predicament they faced a decade and a half ago when there was increasing pressure to abandon their various proprietary operating platforms in favor of Windows CE.

What's the answer? Hard to say. It is impossible to ignore a user base of a billion and counting, because that billion already knows Android and how it works. On the other hand, Android's fragmentation is vexing, there remain questions about platform security (see overview of Android security), and the fact that Android was as clearly designed for capacitive multi-touch as Windows was for a mouse makes it less than perfect for gloves and wet places. At this point it is also still possible that Microsoft might somehow pull a rabbit out of its hat with Windows Embedded 8 Handheld, causing a percentage of the rugged mobile market to go one way and the consumer market another. Remember that the Palm OS once dominated the mobile OS market to an extent where Symbol (now part of Motorola Solutions) had a Palm-based industrial handheld (see here) before the tide went the other way.

Posted by conradb212 at 06:28 PM | Comments (0)

October 02, 2013

October 1, 2013 -- the day Moto Solutions and Honeywell/Intermec became serious about Android

This has been quite the day for Android in the rugged handheld space.

Intermec, now part of Honeywell Scanning & Mobility, announced the CN51 rugged mobile computer. It is an updated version of Intermec's successful CN50, but has a larger, higher resolution screen (4.0-inch, 800 x 480) that now uses resistive multi-touch, updated WiFi, WAN, Bluetooth, camera, and scanners, and it's now based on the dual-core 1.5GHz TI OMAP 4 processor, which means Intermec can offer the CN51 either with Microsoft Windows Embedded Handheld 6.5 OR with Android 4.1.

And Motorola Solutions, the same day, announced that three of its popular enterprise mobile computers would now be available with Android Jelly Bean, fortified with Moto's own Mx security, device management and performance features added on top of the standard Android OS. The three devices:

Motorola Solutions Android mobile computers

Product:      ET1                    | MC40                 | MC67
Type:         Enterprise tablet      | Rugged handheld      | Enterprise PDA
Display:      7-inch (1024 x 600)    | 4.3-inch (480 x 800) | 3.5-inch (480 x 640)
Digitizer:    Capacitive multi-touch | Capacitive           | Resistive
Original OS:  Android                | Android              | Windows Embedded Handheld 6.5
Available OS: Android Jelly Bean     | Android Jelly Bean   | Android Jelly Bean
RAM:          1GB                    | 1GB                  | 1GB
Storage:      4GB Flash + microSD    | 8GB Flash            | 8GB Flash
Size (in.):   8.8 x 5.1 x 1.0        | 5.7 x 2.9 x 0.8      | 6.4 x 3.0 x 1.3
Weight:       1.4 lbs.               | 9.4 oz.              | 13.5 oz.
CPU:          1GHz OMAP 4            | 800MHz OMAP 4        | 1GHz OMAP 4
Scanning:     via 8mp camera         | SE4500-DL            | SE4500-SR/DL/DPM
WWAN:         Data only              | None                 | Voice and data
Op. temp:     32 to 122F             | 32 to 122F           | -4 to 122F
Sealing:      IP54                   | IP54                 | IP67

Motorola Solutions points out that their three Android offerings are not only benefitting from the Mx extensions (see Mx website), but also from the company's RhoMobile Suite (see RhoMobile page), a cross-platform development tool used to create apps that are OS-independent. Which means the Moto Android devices can run standard Android apps, or HTML 5 cross-platform apps created with RhoMobile.


What does it all mean, this pretty high visibility push of Android in business and enterprise class devices? Well, there's the odd situation that while almost all smartphones are either iPhones or Android devices, virtually all industrial handhelds still run one version or another of Microsoft's old Windows CE or Windows Mobile. Which means that most industrial handhelds are by no means ruggedized equivalents of the smartphones a good part of humanity already knows inside out. Which results in extra training, an extra learning curve, and the near certainty that change will come anyway. Up to now, rugged mobile computing manufacturers have been remarkably conservative in acknowledging that trend, generally limiting themselves to offering an exploratory Android version or two of a Windows device running on similar hardware. What we're now beginning to see is the next step, that of making the hardware so that it can run both old and new.

Now, no one wants to alienate Microsoft, of course, and so things are worded carefully. Intermec's press release includes a quote from industry analyst David Krebs, VP of mobile & wireless at VDC: "While rugged devices are designed to be more function or application-specific than smartphones, there is growing consensus that these devices deliver a similar immersive experience and have similar capabilities as function-rich smartphones. As Android matures in the enterprise it represents an increasingly viable option for rugged vendors such as Intermec to bridge this functionality gap and deliver the capabilities their partners and customers are looking for."

On the Motorola Solutions side, Girish Rishi, Senior VP of enterprise solutions, said, "Now, businesses have more choices and can better manage rapid changes in the market by using Motorola’s tools that deliver complete solutions in less time while protecting their mobile investments.”

It's fairly safe to assume that these are just first steps. The proposed hardware still represents compromises and is not (yet) truly Android optimized. But the message communicated by both Intermec/Honeywell and Motorola Solutions is quite clear: We can't wait any longer, Microsoft. We need to get with the program before we lose our markets to consumer smartphones in a case.

Posted by conradb212 at 12:09 AM | Comments (0)

August 11, 2013

Optimizing the legacy Windows interface for touch and tablets

Tablet computers have been around for a quarter of a century, but it wasn't until the iPad's introduction that the tablet form factor took off. That's in part because the technology wasn't quite ready for tablets and, in much larger part, because Windows just didn't work well with tablets. Tablet historians will remember that both the first generation of tablets (circa 1992) and the second one (circa 2001) primarily used Wacom active digitizer pens. Those pens were (and are) precise enough to operate the Windows user interface, especially when aided by special Windows utilities and accommodations (Windows for Pen Computing back in 1992, and the Windows XP Tablet PC Edition in 2002).

Ever since the iPad came onto the scene, the market has been demanding touch instead of pens. Touch works great with operating environments designed for touch, such as iOS and Android, but not nearly as well with Windows. As of late, we've seen a number of new Windows tablets that use either resistive touch or even projected capacitive touch. Resistive works best with a stylus, though a firm finger touch usually also works, albeit not very precisely. Capacitive touch and legacy out-of-the-box Windows, however, are not a great match (except for the new Metro environment in Windows 8).

There's only so much that can be done to remedy that situation. Many people want and need Windows on a tablet, but in order for it to work like Windows, it needs the full Windows user interface, and that is simply not designed for pen and touch. As a result, if you work on a tablet with a small screen and limited resolution, your screen may look like it does in the picture below:

The small scrollers, check boxes, menus and text work perfectly well if the tablet is used with a mouse, but they are almost impossible to use with touch.

Fortunately, there are ways to improve the situation, if only to some extent, and that is done through customization of the Windows interface. Here's how.

In Windows 7, go to Control Panel, then select Personalization. Go to the bottom of the screen and click on Window Color. At the bottom of that window, select Advanced Appearance Settings... What then shows up is the Window Color And Appearance dialog that has been around for many years. It lets users adjust the size of numerous Windows interface items.

The "Item" pulldown provides access to:

3D Objects (lets you select the color)
Active Title Bar (select size and font)
Active Window Border (lets you select the color)
Application Background (lets you select the color)
Border Padding (select size)
Caption Buttons (lets you select the color)
Desktop (lets you select the color)
Disabled Item (lets you select the font color)
Hyperlink (lets you select the color)
Icon (select size and font)
Icon Spacing (Horizontal) (select spacing)
Icon Spacing (Vertical) (select spacing)
Inactive Title Bar (select size, color and font)
Inactive Window Border (select size and color)
Menu (select size, color and font)
Message Box (select font, font size and color)
Palette Title (select size, font and font size)
Scrollbar (select size)
Selected Items (select size, color, font and font size)
ToolTip (select color, font and font size)
Window (select color)

Here's what it looks like:

The default values for all of those items are, surprise, what works best for a desktop computer with a mouse and a large screen. Problem is, those default desktop/mouse settings are also what virtually all Windows tablets come with. It's as if vendors thought Windows worked equally well on any size and type of platform, which it definitely does not.

As a result, Windows, which isn't very suitable for tablets in the first place, is even worse with the desktop/mouse default settings. So how are we going to remedy that? Not easily, but Windows can be optimized to work better on touch tablets.

Colors, obviously, don't make a difference. So let's forget about all the interface items where you can only change color. Icon size and spacing is also pretty much a matter of choice, as is font color, so let's drop those as well. That leaves:

Active Title Bar (select size and font)
Border Padding (select size)
Inactive Title Bar (select size, color and font)
Inactive Window Border (select size and color)
Menu (select size, color and font)
Message Box (select font, font size and color)
Palette Title (select size, font and font size)
Scrollbar (select size)
Selected Items (select size, color, font and font size)
ToolTip (select color, font and font size)

Inactive items also don't matter, and neither does stuff you don't actually interact with, so let's drop those. That leaves:

Active Title Bar (select size and font)
Border Padding (select size)
Menu (select size, color and font)
Palette Title (select size, font and font size)
Scrollbar (select size)
Selected Items (select size, color, font and font size)

Now we get to what matters. Here's why and how:

Active Title Bar -- contains the three all-important boxes that minimize a window, maximize it, or close it. Those are used all the time. They must be large enough to easily operate with touch or a stylus. (On a 7-inch 1024 x 600 pixel display, I set title bar size to 32 and font size to 11).

Border padding -- this one is annoyingly important. It's important because your finger or stylus must connect with the border to resize a window. It's annoying because a border large enough to easily manipulate is an eyesore and takes up too much space, making Windows look clumsy.

Menu -- not critically important, but you'd like to be able to see the menu choices, so make them large enough. I used 32 for size, and 11 for font size.

Palette Title -- honestly, I don't know why I left this one in there. I set it to 32 size and 11 font size.

Scrollbar -- perhaps the most important one of all. If you need to scroll up and down and left and right with your finger or a stylus, it MUST be large enough to easily touch. I made it 32, and the font 11.

Selected items -- that's the choices row when you select something from a menu. It needs to be large enough to read and select from via touch. Made it 32 and the text 11.

So there. Once you've done this, Windows won't be that terribly difficult to operate with your fingers or a stylus. It's not going to look very pretty, and you'll use up far too much valuable screen real estate with interface items designed for a mouse, and now resized to make them work as well as they possibly can with touch.

And here's what it looks like. Note the much larger scrollers, menus, window borders and text:

Another good thing to know is that you can save those settings as themes (see image below). That means you can quickly toggle between a theme that's optimized for use with a mouse (or for when the tablet is hooked up to a larger external screen) and one for when the device is used as a tablet with finger touch.

Can that be done in Windows 8 as well? Not as easily. The "Advanced Appearance Settings" dialog is gone from the Windows 8 (legacy) Control Panel. To change the interface, users can change text size in the Display control panel. But everything else can only be adjusted with the Registry Editor, which is not for the faint of heart (see how it's done).
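
For those comfortable with a bit of scripting, here's a minimal sketch of that registry approach, assuming Windows and Python's standard winreg module (the same WindowMetrics values also underlie the Windows 7 dialog described above). The sizes are stored as strings in twips, negated: -15 times the desired pixel size. A log-off/log-on is typically needed before new metrics take effect, and as always, edit the registry at your own risk.

    import winreg

    TWIPS_PER_PIXEL = 15  # WindowMetrics stores sizes as -15 * pixels

    def set_metric(name, pixels):
        """Write one WindowMetrics entry, e.g. ScrollWidth or CaptionHeight."""
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                            r"Control Panel\Desktop\WindowMetrics",
                            0, winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, name, 0, winreg.REG_SZ,
                              str(-TWIPS_PER_PIXEL * pixels))

    # Roughly the touch-friendly sizes used above: 32-pixel scrollbars,
    # title bars and menus (defaults are mostly 17 to 22 pixels), plus
    # a modest amount of border padding.
    for name in ("ScrollWidth", "ScrollHeight", "CaptionHeight", "MenuHeight"):
        set_metric(name, 32)
    set_metric("PaddedBorderWidth", 10)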

Posted by conradb212 at 03:48 PM | Comments (0)

June 25, 2013

Logic Supply's logical approach to engineering their own systems

When it comes to rugged computing gear, most people interested in this industry know the big players that dominate the market and get all the media coverage. But that's not everything there is. Unbeknownst to many outside of the circle of customers and prospects, a surprising number of smaller companies are designing and manufacturing rugged computing systems of one type or another. At times we come across them by chance. Other times they find us.

And so it was with Logic Supply, located in South Burlington, a small town in the northwestern part of Vermont. They call themselves "a leading provider of specialized rugged systems for industrial applications," and asked if we could include them in our resource page for rugged system vendors. We could. And that led to some back and forth where I learned that while Logic Supply distributes a variety of rugged/embedded systems and components, they have also begun developing their own high-end chassis under their own LGX brand. That was interesting, and so we arranged an interview with Rodney Hill, Logic Supply's lead engineer, on the company, and how they go about creating their own, home-developed solutions in addition to being a distributor of products engineered elsewhere.

RuggedPCReview: If you had to describe Logic Supply’s approach to case and system engineering in a minute or less, what would you say?

Rodney Hill (Logic Supply): So, LGX is Logic Supply’s engineering arm. The design approach for LGX systems and cases can be boiled down to three ideas. First is our “designed to be redesigned” philosophy. Seed designs that are scalable and modular. From a seed idea we can create a product line through swappable front- and back-plates or resized geometry. Second is mass customization — by using standardized screws, paints, sheet metal folds, and design concepts, we leverage mass produced hardware whenever possible to keep the cost low. And through our modular designs we customize the off-the-shelf by using “upgrade kits,” which are quick to source and are cost-effective. Finally, innovation, not invention! There is a difference. Add value to things that work well, but do not re-invent the wheel.

Was that under a minute?

RuggedPCReview: Almost. But now let’s expand. You said scalable, modular, and “designed to be redesigned.” What do you mean by this?

Rodney Hill (Logic Supply): Designing a new chassis is four to five times the price of redesigning a seed design. Much of the time wasted in projects is spent selecting paints, screws, boxing, foam, metal fold designs, etc. By using standardized design methods and seed concepts, our team can immediately start adding value. Ultimately the customer is only paying for the design and not the time the engineers spent trying to get their act together. We will be faster and more focused on quality and containing costs and risk.

So, to your question, designed to be redesigned systems from LGX have already incorporated flexible features to accommodate 80% of the customizations that customers request with off-the-shelf hardware. The last 20% are resolved with ‘upgrade’ kits that will be included with the off-the-shelf chassis kit. But you’re also using the proven benefits of the rest of the chassis (EMI and RFI shielding, for instance), and only adding risk in small portions. Meaning the rest of the chassis is still meeting all the same design criteria it was originally intended to support. So you can easily customize it without the risk of any negative effects on any of those features.

In terms of scalability versus modularity — there are design themes seen in our cases. If you look carefully enough, you can begin to see connections between designs. The NC200 and the MK150 are two totally different designs – however they share about 80% of the same DNA, from vent holes to metal folds, etc.

RuggedPCReview: How does cooling play into ruggedization?

Rodney Hill (Logic Supply): Nature always wins. Meaning dust and water will destroy everything if given the chance. You need to decide how long the computer needs to live, and how much you’re willing to pay for it. Heat will shorten the life of components.

So in terms of chassis design concepts: Keep the chassis as cool as possible and as quiet as possible. Intelligent design is required to incorporate standardized cooling methods and proven airflow paths to cool many types of devices. Fan diameter, placement, and vent design all have effects on the acoustic design as well. Logic Supply will engineer noise out of systems with fan-muffling technologies, maximizing air throughput with smaller, simpler fans by identifying inefficiencies in orifice design. In short, having a fan against a grille will kill 50-60% of the airflow and multiply the noise by two or three times.

Vent holes equal dust. Dust causes fans to break, which in turn results in hot computers. Eliminate vents and go fanless, and the operational temperature range and ruggedness greatly increase. Logic Supply defines "fanless" differently than the IMP mass market. Our definition is not simply "no fans." It is more than that: no fans, no vents for dust and dirt, and maintaining the ability to cool the computer system at 100% duty load for hours and days at a time. We want these systems to be heavy duty, and also to be able to last a long time. It is rated for high performance!

RuggedPCReview: Can you talk about the design process? How long does it take from start to finish?

Rodney Hill (Logic Supply): It happens pretty fast. This year we’ve done a fanless Mini-ITX case, a 1U rackmount case, a NUC [Next Unit of Computing] case, and we’re finalizing a fanless NUC case right now. We’ve also finished a number of customer-specific designs. These design concepts typically originate in sales — you know, this customer wants to do X and none of our existing solutions do it. But because we use seed designs, we don’t start from scratch. It really all depends, but usually designs take under three weeks, and prototypes are ready a few weeks after that. We review, test, and modify, then we’re typically getting production units in-house around five or six weeks after that.

These core platforms can then be sold off-the-shelf, or customers can either go the semi-custom route or more radically modify the design. For simple modifications (like back-plates, front-plates, and simple changes) maybe one to five days in design and a three to four day lead on parts. For customized chassis design with samples, five to six weeks, and four to six after that to mass production.

RuggedPCReview: Alright, finally, can you give us an example of a successful customer product development?

Rodney Hill (Logic Supply): Sure. Last year we worked with a company called StreamOn to make a custom appliance with off-the-shelf components. StreamOn offers streaming audio solutions for the radio broadcast industry. The hardware they were using at the time was going End-of-Life, and they also needed a more specialized embedded system because their business was growing and they wanted to offer more features to their customers. They needed a variety of other things — outsourced fulfillment and things like that — but from an engineering perspective it was mostly that — the EOL and specialization. And all while remaining affordable for their customers. We worked from an existing system design — the ML250 — and customized it toward what they needed. We added an SSD, LCD screen and multifunction buttons, and on-case branding.

Ultimately, the system we created was something like 30% smaller, and it was fanless, so it was more efficient, and had a longer life expectancy. It also had a built-in watchdog timer and auto restart bios so it could avoid any complications related to sudden power outages, etc. And it actually ended up being even less expensive for their customers than what they were previously offering. So that all worked out quite well. In fact, they recently won the [Radio World] “Cool Stuff Award,” which was pretty, well, cool!

This whole process was consistent with our typical design timeline, by the way. From the initial conversations to mass production — with samples and prototyping — we took about three months.

Posted by conradb212 at 03:16 PM | Comments (0)

June 24, 2013

Why the JTG Daugherty NASCAR racing team chose rugged Dells

The Christmas tree began its count-down. Yellow, yellow, yellow, GREEN! For an anxious moment, the racing slicks of my supercharged Acura fought for traction, then bit. 8,000, 8,500, 8,800 rpm, shift. Shift. Shift. Shift, and the 1/4-mile at Sacramento Raceway was over. The car slowed and I reached over to stop data logging on the laptop securely sitting on its mount, having just recorded tens of thousands of data points as the car shot down the track. The laptop was a Dell Latitude ATG D630, connected via USB to the Hondata ECU under the dash of the car. Minutes later I would analyze the run on the Dell, temperatures, shift points, slippage, air/fuel ratio, knocks, timing, etc., and then make changes on the fly. The next heat was in less than 15 minutes.

At the time I didn't know that a few years later I'd be talking with the JTG Daugherty NASCAR racing team about how they used rugged Dell laptops on their #47 Sprint Cup car, driven by NASCAR legend Bobby Labonte. Labonte won the Cup in 2000 during an era where he was a perennial contender. And also won IROC in 2001, following in the footsteps of his brother Terry Labonte, also an IROC and Cup champion. Now a senior amongst NASCAR drivers at age 49, Labonte's piloting car #47 for the team owned by Jodi and Tad Geschickter, and NBA Hall of Famer Brad Daugherty. Lady Luck hasn't been too kind to them this season, but that's certainly not due to this talented group and also not due to the technology they're using. Most recently, while Martin Truex Jr. won at Sonoma Raceway in his Toyota Camry, a blown oil cooler ended Labonte's race in essentially the same Camry before it even began. Those are the breaks.

So I felt almost guilty when I got on the phone Monday morning after that race with Matt Corey, the IT administrator at JTG Daugherty Racing, and Dell's Umang Patel and Alan Auyeung to discuss JTG Daugherty's use of Dell technology. Corey, in particular, probably didn't feel too good after the frustrating weekend and had plenty of other things to do at the shop, but he took the call. Much appreciated.

So how is JTG Daugherty Racing using Dell computers? And what made them decide to use Dell from the driver to the garage and the pit crew to the data center with a complete suite of Dell technology and solutions that also includes rugged Dell ATG and XFR laptops? The choice of Dell for data center and office isn't much of a surprise, given that Dell has consistently been in the top three PC vendors worldwide and in the US. What's more interesting is that JTG Daugherty also chose Dell for their rugged laptops, a field dominated by Panasonic, Getac and a number of other vendors specializing in rugged equipment.

Corey began by explaining the inherent need of a NASCAR racing team for rugged technology. No surprises here. There's rain, dust, vibration, extreme temperatures, the whole gamut of hazards rugged mobile computing gear is designed and built to survive. Add to that the extreme time crunch as a race car is tested and prepared, the extreme need for absolute reliability in a sport where fractions of a second matter, and a race car is checked, refueled and has all of its tires changed in something like 13 seconds. Things simply must not go wrong, ever, in such an environment, and racing teams certainly cannot put up with finicky computing technology that may or may not be up to the job. As an example, Corey tells of an incident where a consumer laptop simply wasn't able to handle vibration, causing them a lot of grief.



So as a result, JTG Daugherty now uses rugged gear. Their race engineering team has Dell Latitude E6430 ATG laptops. The ultra-rugged Dell Latitude E6420 XFR is used on the truck and trailer. They also use Windows-based Dell Latitude 10 tablets in Griffin Survivor cases supplied by Dell. All of this means that the team can collect performance stats, analyze them, and make changes quickly and reliably. "We have connectivity everywhere," said Corey. "As the car chief makes a decision about a change to the car, for example, he now notes this on his Latitude 10 and the information is instantly communicated to everyone across the organization. All decisions from the car chief trickle down to updates to the racecar and with everyone synced together with tablets and other Dell technology, that information flow is now much faster, more reliable and more efficient."

But still, why Dell for the rugged gear? Here I expected Corey to point to the advantage of dealing with a one-stop vendor. Instead he says that they had used Toughbooks in the past and liked them, but that "they really didn't change much over the years, same always," and that Dell updates more frequently. "We don't want plain vanilla," he said, "we want to be on the cutting edge of technology," and he listed the memory and processing speed required to power through race simulations, high resolution imaging, and massive data sets.

Staying at, or near, the leading edge in technology while still adhering to the longer purchase cycles and life spans of rugged equipment, and guarding against obsolescence of docks, peripherals, accessories and software, has always been a challenge for the rugged computing industry. While Intel tends to unveil a new processor generation and ancillary technology every 12 to 18 months, the rugged industry cannot possibly update at the same pace. Even Dell is not immune in that regard; as of now, the rugged XFR laptop is still based on the E6420 platform.

Yet, Dell does have the advantage of very high production volume and with that comes access to the very latest technology. Combine that with the convenience and peace of mind of dealing with a large one-stop shop, and it's no surprise that even a NASCAR racing team chose Dell.

See NASCAR Team Selects Dell to Speed Past the Competition, Dell's Latitude for Rugged Mobility page, and RuggedPCReview.com's most recent reviews of the Dell Latitude ATG and Dell Latitude XFR.

Posted by conradb212 at 07:47 PM | Comments (0)

May 28, 2013

Rugged notebooks: challenges and opportunities

I've been working on setting up our new rugged notebook comparison tool over the past few days. So far, the tool, where users can compare the full specs of up to three rugged notebooks side-by-side and also quickly link to our analysis of the machines, has far fewer entries than our comparison tools for rugged handhelds and rugged tablets. As I asked myself why there were only relatively few products out there, I thought about the overall rugged notebook situation.

A little while ago I came across a news brief by DigiTimes, the Taipei-based tech news service that's always interesting to read (albeit not always totally accurate). The news item was about Getac gunning for an increased market share in rugged notebooks. DigiTimes said the current worldwide rugged notebook market share situation was something like Panasonic having 60%, and Getac and General Dynamics Itronix each about 12.5%. They didn't specify the remaining 15%, but it's obviously spread among a number of smaller players.

That news came just a short while after General Dynamics officially pulled the plug on Itronix, so the 12.5% that used to be GD-Itronix rugged notebooks such as the GD6000, GD8000 and GD8200 is now gone and up for grabs. Who will step up to bat? Will Getac take over what GD-Itronix used to have? Or will Panasonic's Toughbooks get even more dominant? Or will perhaps someone else emerge?

There's no easy answer. And the market is a rather fragmented one. First, it's not totally clear what makes a notebook "rugged." In the business we generally differentiate between "rugged" and "semi-rugged," where the more expensive fully rugged devices carry better sealing and are built to handle more abuse than semi-rugged models, which offer somewhat less protection but usually cost and weigh less in return. But rugged and semi-rugged are not the only designations you see in the market. Some manufacturers also use terms such as "business-rugged," "vehicle-rugged," "durable," or even "enterprise-rugged." There's also "fully-rugged" and "ultra-rugged."

Of machines on the market, we'd consider products such as the Panasonic Toughbook CF31, Getac B300 or GD-Itronix GD8200 as rugged, and the Panasonic Toughbook 53, the Getac S400 and the GD-Itronix GD6000 as semi-rugged. But then there are also notebooks specifically made for enterprise and business that are better made than run-of-the-mill consumer notebooks, but somehow defy definition. Examples are the very light magnesium notebooks by Panasonic that cost a lot more than any regular laptop and can take much more abuse, but do not look tough and rugged.

Then there's yet another category of laptops that are almost exclusively used for business and vertical market applications, and that's the convertible notebooks. These had their origin when the industry was intrigued by tablets in the early 1990s and then again in the early 2000s, but wasn't quite sure if customers would accept them, so they made something that could be used both as a tablet and as a laptop. These usually cost more than notebooks, and were heavier than tablets, but somehow the concept is still around, and there are many models to choose from. Some are fully rugged, such as the Getac V100/V200 and the Panasonic Toughbook 19, others are semi-rugged like the Panasonic Toughbook C2, or business-rugged, such as the Lenovo ThinkPad X230t or the HP EliteBook 2760p.

Yet another category is rugged notebooks that are based on large volume consumer notebooks. Examples are the semi-rugged Dell Latitude ATG and the fully rugged Dell Latitude XFR. With Dell having quick and easy access to all the latest technology, the ATG at least is almost always at, or close to, the state-of-the-art in processors and other technology.

And there are further twists. While the likes of Panasonic and Getac make their own notebooks, a good number of others are made by a small handful of OEMs under exclusive or (more often) non-exclusive agreements with resellers that put their own brand names and model numbers on the devices. Taiwanese Twinhead, for example, had a longstanding relationship with the now defunct General Dynamics Itronix, with some models exclusive to Itronix and others marketed by various vendors through different channels. That can make for interesting situations. While Twinhead was and is an important OEM, they also sold their mostly semi-rugged lineup under their own name and the Durabook brand, and also through their US subsidiary GammaTech.

But there's more. A number of smaller players, or small parts of larger industries, provide highly specialized rugged notebooks that are often so unique as to only target very narrow markets. Some machines are built specifically to the requirements of military and other government contracts. Their names and brands are usually unknown to anyone outside of the small circle of targeted customers.

Why are there so few rugged and semi-rugged notebooks? One reason is that the market for them isn't all that large. They are popular in police cars and similar applications, and wherever notebooks simply must be much better built than fragile consumer models. Another reason is price. Even relatively high-volume semi-rugged laptops cost two to three times as much as a similarly configured consumer model. Rugged notebooks run three to five times as much, and specialized models may be ten times as much.

By and large, the rugged computing industry has been doing a good job educating their customers to consider total cost of ownership as opposed to looking only at the initial purchase price, but it's not always an easy sell. And with handy, inexpensive tablets flooding the market, it isn't getting any easier. Add to that the fact that makers of rugged notebooks always had a special millstone hanging around their necks, that of having to make sure that products stay compatible with existing docks, peripherals and software. That often prevents them from adapting to new trends and switching to newer technologies and form factors (like, for example, wider screens) as quickly as some customers demand. While it's certainly nice to see Intel coming out with a new generation of ever-better processors every year or two, it's not making it easier for rugged manufacturers to stay current in technology and features either.

As is, if Itronix really had a roughly 12.5% market share, that slice of the pie is now up for grabs and it should be interesting to see who ends up with it.

Posted by conradb212 at 03:24 AM | Comments (0)

May 17, 2013

How Motorola Solutions made two mobile computers condensation- and freezer-proof

Good phone conversation today with the PR folks from Motorola Solutions. The occasion was the introduction of two interesting new products, the Omnii XT15f industrial handheld and the Psion VH10f vehicle-mount computer. The key here is the "f" in both of the names. It stands for "freezer," and that's what the two new devices are all about. Big deal?

Actually, yes. At least for workers who use their computers in and around freezers. That includes storage of perishable foods, the strictly temperature-controlled environments where medications are stored, and numerous other places for goods that need to be or stay frozen. So what's the issue? You just get devices that can handle the cold and that's it, right?

Yes, and no. While the environmental specs of most rugged computing devices include an operating temperature range, that range only indicates the temperatures within which the device can be used. In the real world, and particularly when working around freezers, temperature alone isn't the whole issue. What matters is how a device handles frequent, rapid changes in temperature. The real enemy then becomes condensation, not temperature as such. Extreme temperatures remain an issue, of course, but one that must be addressed as part of the larger issue of rapidly changing temperatures.

So what exactly happens? Well, if you go from a hot and humid loading dock into a freezer, the rapidly cooling air in and around a device loses its ability to carry moisture, which then becomes condensation. That condensation then freezes, which can put frost on displays, rendering them illegible, freeze keys on the keypad, and possibly cause internal shorts. When the worker leaves the freezer environment, the frost quickly melts, again affecting legibility of the display and possibly causing electrical shorts. It's quite obvious that extended cycling between those two environments not only makes the device difficult to use, but is almost certainly going to cause damage over time.
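
To put a rough number on it, here's a quick back-of-the-envelope sketch in Python using the well-known Magnus approximation of the dew point (the constants are standard; everything else is just illustration):

    import math

    def dew_point_c(temp_c, rel_humidity_pct):
        """Magnus approximation of the dew point, good to about 0.1C here."""
        a, b = 17.62, 243.12
        gamma = a * temp_c / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
        return b * gamma / (a - gamma)

    # A muggy 30C (86F) loading dock at 70% relative humidity has a dew
    # point near 24C -- far above freezer temperatures, so any surface
    # carried inside will immediately collect moisture and frost over.
    print(round(dew_point_c(30.0, 70.0), 1))  # about 23.9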

Now add to that the slowing down of displays in extreme cold and the general loss of battery capacity, and it becomes obvious why this is an issue for anyone using mobile computers in those environments. And hence the new "freezer" versions of those two Motorola Solutions products (Omnii XT15f on the left, Psion VH10f on the right).

So what did Motorola do? Weber Shandwick's ever helpful Anne Norburg suggested I talk to the source and arranged the call, and so I had a chance to ask Amanda Honig, who is the Product and Solutions Marketing Manager for Enterprise Mobile Computing, and Bill Abelson of Motorola's media team. The overall challenge, they said, was to provide reliable "frost- and condensation-free" scanning. In order to do that, they had to address a number of issues:

Since the scan window can fog up, they used internal heaters to automatically defog the window, thus facilitating accurate scans under any condition.

Since external condensation can quickly freeze around keys and make the keypad difficult or impossible to operate, they designed special freeze-resistant keyboard layouts with larger and more widely spaced keys.

Since the airspace between the LCD display and the touchscreen overlay can fog up from condensation and make the display unreadable and imprecise to operate, they optically bonded layers to eliminate air spaces and used a heater to eliminate internal display fogging.

Since battery capacity tanks in very low temperatures and standard batteries can get damaged, they used special low temperature batteries with higher capacities and minimized performance loss at low temperatures.

And to make sure this all worked transparently and without needing any worker involvement, they included environmental sensors and heater logic circuitry so that the device automatically handles the rapidly changing temperatures and humidity. There are, however, also ways to do it manually.
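
I don't know what Motorola's actual control algorithm looks like, but conceptually that kind of heater logic can be as simple as a hysteresis loop that keeps vulnerable surfaces above the ambient dew point. The sketch below is purely illustrative, with invented thresholds:

    DEW_MARGIN_C = 2.0   # keep the surface this far above the dew point
    HYSTERESIS_C = 1.0   # dead band to avoid rapid on/off cycling

    def heater_state(surface_temp_c, dew_point_c, heater_on):
        """One pass through the control loop: returns the new heater state."""
        if surface_temp_c < dew_point_c + DEW_MARGIN_C:
            return True     # too close to condensing: heat
        if surface_temp_c > dew_point_c + DEW_MARGIN_C + HYSTERESIS_C:
            return False    # safely above the dew point: save battery
        return heater_on    # inside the dead band: leave as-is

    # Scan window at 1C after entering the freezer, dock-air dew point 3C:
    print(heater_state(1.0, 3.0, heater_on=False))  # True -- heater kicks in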

Finally, since it makes no sense to overbuild, they offer two versions. One is called "Chiller" and is considered "condensation-resistant," with an operating temperature range of -4 to 122 degrees Fahrenheit. The other is called "Arctic" and is considered "condensation-free." That one can handle -22 to 122 degrees. The Chiller and Arctic versions add US$700 and US$1,100, respectively, to the cost of the basic Omnii XT15 handheld computer. If it means fewer equipment hassles when getting in and out of freezers, that's a small price to pay.

There's another interesting aspect to all this. Changing and upgrading existing equipment is never easy, but in this case it was made easier because Psion, even prior to its acquisition by Motorola Solutions, had given much thought to modular design as a means of adapting quickly and easily to special requirements, simplifying maintenance, and future-proofing its products. At the very least this means much of the repair and maintenance work can be done in the field. And I wouldn't be surprised if it also made it easier to come up with these special versions.

Posted by conradb212 at 11:19 PM | Comments (0)

May 14, 2013

Handheld: Pursuit of a vision

I had a chance yesterday to meet over dinner with Sofia Löfblad, Marketing Director at Handheld Group AB, and Amy Urban who is the Director of Marketing at Handheld US. I hadn't seen them since I presented at the Handheld Business Partner Conference in Stockholm three years ago, and it was a pleasure catching up in person.

The Handheld Group (not to be confused with Hand Held Products, which is now part of Honeywell) is a remarkable rugged mobile computing success story. Having its origins as a distributor of vertical market mobile computers from the likes of Husky, TDS and others, Handheld went on to establish its own identity with its own distinct product lines. In fact, they call themselves a "virtual manufacturer."

What does that mean? Well, while it is not unusual for larger distributors to resell OEM products under their own name, Handheld went one step farther. They not only have their own brands (Nautiz for rugged handhelds, Algiz for rugged tablets), but also their own design language and color scheme (Sofia even knew the precise Pantone color numbers), and they often have exclusive arrangements with their OEMs. So in addition to having a very cohesive brand identity and consistent look, Handheld products are less likely to immediately be identified by industry followers as rebranded versions of a common design.

How has that worked out for the Handheld Group? Apparently quite well. They now have ten facilities all over the world, as well as several hundred authorized partners. And they've been able to score impressive wins like a contract for 10,000 rugged handhelds with Netherlands Railways against much larger competition.

They also proved their knack for coming out with the right product at the right time with devices such as the Algiz 10X (a rugged but light and handy 10-inch tablet), the Algiz XRW (a super-compact rugged notebook), and the Nautiz X1, which they call the toughest smartphone in the world. On the surface, that doesn't sound all that terribly exciting, but it really is, and here's why:

I am on record as bemoaning the demise of the netbook, those small and handy notebooks that used to sell by the tens of millions. Then they disappeared due to a combination of being replaced by consumer tablets and, even more so, an unfortunate industry tendency to keep netbooks so stunted in their capabilities as to render them virtually useless for anything but the most basic tasks. Well, now that they are gone, many wish they could still get a small, competent Windows notebook that's tough and rugged, but isn't as big, expensive and bulky as a full-size rugged notebook. And that is the Algiz XRW. I've liked it ever since I took an early version on a marine expedition to the Socorro islands a couple of years ago (see Case Study: Computers in Diving and Marine Exploration). And the latest is the best one yet (see here).

The Algiz 10X likewise is a Q-ship (i.e. an innocuous looking object that packs an unexpected punch). On the surface, it's just a rugged legacy tablet, albeit a remarkably compact and lightweight version. And while that is mostly what it is, the 10X hits a sweet spot between old-style rugged tablet and new-style media tablet. One that will likely resonate with quite a few buyers who still need full Windows 7 and full ruggedness on a tablet and also some legacy ports, all the while also wanting a bright wide-format hi-res screen and a nice contemporary look.

Then there's the Nautiz X1 rugged smartphone, and that's a real mindblower. By now there are quite a few attempts at providing consumer smartphone functionality in a tougher package, but none as small, sleek and elegant as the Nautiz X1. It measures 4.9 x 2.6 inches, which is exactly the size of the Samsung Galaxy S2 (the one before Samsung decided to make the displays almost as big as a tablet). At 0.6 inches it's thicker, and it weighs 6.3 ounces, but for that you get IP67 sealing (yes, totally waterproof), a ridiculously wide -4 to 140 degree operating temperature range, and all the MIL-STD-810G ruggedness specs you'd usually only get from something big and bulky. Which the Nautiz X1 definitely is not.

In fact, with its gorgeous 4-inch 800 x 480 pixel procap screen, Android 4.x, and fully contemporary smartphone guts, this is the tough smartphone Lowe's should have looked at before they bought all those tens of thousands of iPhones (see here). Don't get me wrong—I adore the iPhone, but it's devices like the Handheld Nautiz X1 that belong in the hands of folks who use smartphones on the job all day long, and on jobs where they get dropped and rained on and so on.

I don't know if Handheld is large enough to take full advantage of the remarkable products they have. They've done it before with that big contract in the Netherlands. But whatever may happen, it's hard not to be impressed with their fresh and competent products that go along with their great people, and their fresh and competently executed business plan.

Posted by conradb212 at 03:59 PM | Comments (0)

April 24, 2013

Itronix RIP

Last week, as I came to a stop at a red light, a police car stopped in the lane next to me. What immediately caught my eye was an expertly mounted rugged notebook computer, angled towards the driver. It was a GD-Itronix rugged notebook, probably a GD6000 or GD8200, with an elegant matte-silver powder-coated insert on top of the magnesium alloy computer case that prominently featured the "General Dynamics" brand name. The officer perused the screen, then looked up, and briefly our eyes met. He had no idea how well I knew that computer in his car, and the one that came before it, and the one before that.

I began following Itronix in the mid-1990s when their rugged notebooks still carried the X-C designation that stood for "Cross Country." Around that time, Itronix purchased British Husky and with that came the FEX21, and since Windows CE was starting to come on strong in smaller rugged devices, Itronix also introduced the tough little T5200 clamshell. I remember a call with Itronix in 1996 or so when I was watching my infant son in the office for an hour or two while his mom was shopping. The little guy was not happy and screamed his head off the entire time I was on the phone with Matt Gerber who told me not to worry as he had a couple of young kids himself. I remember hoping he didn't think we were running a monkey operation.

Around the turn of the millennium, Itronix in a clear challenge to Panasonic's rugged, yet stylish Toughbooks, launched the GoBook. It was a clean, elegant, impressive machine with such cool features as a waterproof "NiteVue" keyboard with phosphorescent keys, and seamless, interference-shielded integration of a variety of radio options. I was impressed.

That first GoBook would quickly evolve into larger, more powerful versions and then spawn a whole line of GoBook branded rugged notebooks, tablets and interesting new devices such as the GoBook MR-1 that measured just 6 x 4.5 inches, yet brought full Windows in a super-rugged 2.5-pound package to anyone who needed the whole Windows experience in such a small device. On the big boy side came the impressive GoBook II, then III, and then "Project Titan," the incomparable GoBook XR-1. At the time we said that it had "raised the bar for high performance rugged notebooks by a considerable margin. It has done so by offering a near perfect balance of performance, versatility, ruggedness and good industrial design." High praise indeed, and totally deserved.

Itronix also branched out into the vehicle market with the semi-rugged GoBook VR-1 and into tablets with first the GoBook Tablet PC and then the GoBook Duo-Touch that combined both a touchscreen and an active digitizer into one small but rugged package. But even that wasn't all. With the introduction of the GoBook VR-2 came DynaVue, a truly superior new display technology that just blew my mind. Tim Hill and Marie Hartis had flown down from Spokane to demonstrate DynaVue on the new VR-2, and both could hardly contain their excitement. DynaVue ended up revolutionizing rugged systems display technology with a very clever combination of layering of filters and polarizers, and its approach became the basis of outdoor-viewable display technology still in use today.

I'll never forget a factory tour of the Itronix main facility in Spokane, meeting and speaking with some of the most dedicated engineers, designers, product planners and marketing people in the industry. I visited their ruggedness testing (I always called it "torture testing") lab which rivaled what I had seen at Intermec and at Panasonic in Japan. I spoke with their service people, the folks on the shop floor and with management. What a talented and enthusiastic group of people. The sky seemed the limit. (See report of the 2006 Spokane factory tour)

But change was brewing. Itronix's stellar performance had attracted suitors, and giant defense contractor General Dynamics, then a US$20 billion company with some 70,000 staff, felt Itronix would nicely complement and enhance its already massive roster of logistics, computing and military hardware. The sale had come as no surprise. Everyone knew it was eventually going to happen. Equity investment firm Golden Gate Capital had purchased Itronix in 2003 from former parent Acterna with the intent of prepping Itronix for a sale. Within just two years, Itronix prospered enough to make it a lucrative proposition for General Dynamics. Within Itronix, the hope was that the sheer mention of the name "General Dynamics" would open doors.

In our GoBook VR-1 review we cautiously offered that "the change in ownership will be both a challenge and a tremendous opportunity for Itronix."

Turns out we were right about the challenge part. The "GoBook" was quickly dropped in favor of a GDxxxx nomenclature, and with it the laboriously earned GoBook brand equity. There were attempts to somehow merge another GD acquisition, Tadpole, into Itronix, and that turned out to be difficult. No one seemed to know what to expect. And then the hammer fell.

In early 2009, General Dynamics announced it would phase out the Itronix computer manufacturing and service facility in Spokane, Washington and operate the business out of Sunrise, Florida where the company's C4 Systems division had an engineering facility instead. It was a terrible blow for Spokane, where losing Itronix meant the loss of almost 400 jobs. And the cross-country move meant Itronix lost most of what had made Itronix the vibrant company it had been.

It was never the same after that. On the surface things continued to look good for a while. There seemed to be a cohesive product line with the GD2000, GD3000, GD4000, GD6000 and GD8000 rugged computing families. But from an editorial perspective, we were now dealing with people who didn't seem to know very much about the rugged computing business at all. And there no longer seemed to be a direction. Some of the final products were simply rebadged products from other companies. Eventually, there was mostly silence.

In January 2013, I was told that "after in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products." In April 2013 came the end: "Itronix has phased out all products."

That's very sad. A once great company gone. Could it have been different? Perhaps. But Itronix was often fighting against the odds. Even in its heydays, Itronix primarily worked with Taiwanese OEMs whereas its major competitors at Panasonic and Getac controlled their entire production process. In addition, while its location in Spokane was a calming constant, Itronix ownership was forever in flux. Itronix was started in 1989 as a unit of meter-reading company Itron to make rugged handheld computers. It was spun off from Itron in 1992, then sold to rugged computer maker Telxon in 1993. In 1997, telecom testing gear company Dynatech Corp. bought Itronix from Telxon for about $65 million. Dynatech changed its name to Acterna in 2000, but fell on hard times and sold Itronix to private equity firm Golden Gate Capital in 2003 for just US$40 million in cash. Golden Gate held on to it for a couple of years before General Dynamics came along. -- The band Jefferson Starship comes to mind here, with Grace Slick charging "Someone always playing corporation games; Who cares they're always changing corporation names..."

Perhaps there could have been a management buyout. Perhaps the City of Spokane could have helped. But that didn't happen, and though in hindsight it seems like a natural, there are always reasons why things happen the way they happen.

As is, there once was a superbly innovative company called Itronix, and they did good. I will miss them, and so probably will everyone interested in rugged computing equipment. I bet the police officer I saw with his Itronix laptop will, too.

Posted by conradb212 at 06:52 PM | Comments (0)

March 27, 2013

Xplore adds Common Access Card reader-equipped rugged tablet for military and government

This morning, March 27, 2013, Xplore Technologies introduced a new version of their ultra-rugged tablet computer, the iX104C5-M2. In essence, this is a specialized model for military and government personnel that require additional hardware security on top of the various security hardware, software and firmware measures already inherent in modern computing technology. What the new M2 model adds is an integrated common access card (CAC) reader. With the reader, in order to get access to critical data, a U.S. government issued ISO 7816 smart card must be inserted.

Why is the ability to read such cards and to provide data access only with such a card important? Because it's mandated in directives and policies such as the Homeland Security Presidential Directive 12 (HSPD-12) that requires that all federal executive departments and agencies use secure and reliable forms of identification for employees and contractors. A chip in the CAC contains personal data, such as fingerprint images, special IDs and digital certificates that allow access to certain federally controlled systems and locations. As a result, both Federal agencies and private enterprise are now implementing FIPS 201-compliant ID programs.

But what exactly do all those card terms mean? What's, for example, the difference between a CAC and a PIV card, and how do they relate to Smart Cards in the first place? Well, the term "smart card" is generic. It's simply a card with a chip in it. The chip can then be used for data storage, access, or even application processing. A CAC is a specific type of smart card used by the US Department of Defense. A PIV (Personal Identification Verification) card is also a FIPS 201-compliant smart card used by the Federal government, but it's for civilian users. Then there's also a PIV-I smart card where the "I" stands for "Interoperable," and that one is for non-Federal users to access government systems.

The way a CAC works, specifically, is that once it's been inserted into the CAC reader, a PIN must be entered. The system then checks via network connection with a government certificate authority server and either grants or denies access. The CAC stays in the reader for the entire session. Once it's removed, the session (and access) ends.
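
As a toy model (and nothing more -- real implementations sit on PC/SC smart card middleware, and every class and value below is an invented stand-in), the access logic looks something like this:

    from dataclasses import dataclass

    @dataclass
    class Card:
        pin: str           # the PIN is verified against the card itself
        cert_valid: bool   # stands in for the network check with the CA server

    def grant_access(card, entered_pin):
        """Access requires both a correct PIN and a valid certificate."""
        if entered_pin != card.pin:
            return False
        return card.cert_valid

    # The session lasts only while the card stays in the reader;
    # pulling the card ends both the session and access.
    card = Card(pin="1234", cert_valid=True)
    print("access granted" if grant_access(card, "1234") else "access denied")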

What this means is that only computers that have a CAC reader can be used for certain military and other governmental work. And the new Xplore iX104C5-M2 provides that reader. It's built directly into the chassis where it is secured and protected.

I had a chance to talk with Xplore Technologies representatives prior to the release of the new model. They said they created this new version specifically to meet the requirements of the Department of Defense, the Air Force, and Homeland Security. According to them, the potential market for CAC-equipped ruggedized tablets is 50,000-100,000 units. Considering that a rugged Xplore tablet lists for over US$5k, that would value that market at between a quarter billion and half a billion dollars. Xplore's competition, of course, will step up to bat as well, and not all CAC-equipped computers will require the superior ruggedness and portability of an Xplore tablet. But it's easy to see why Xplore, a company with roughly US$30 million in annual sales, would throw its hat in the ring. Even winning a small percentage of the estimated value of this market could have a sizable impact on Xplore.

While I'm at it, let me recap what Xplore Technologies is all about and what they have to offer. Unlike the flood of Johnny-come-latelies attempting to grab a slice of the booming tablet market, Xplore has been making tablets for the better part of two decades. Starting in the mid-1990s, the company began offering some of the finest and most innovative rugged tablet computers available. They were at the forefront of integrating multiple wireless options into rugged tablets, offering truly outdoor-readable displays, and providing dual-mode digitizers that automatically switched between active pen and touch. We reviewed their current iX104C5 tablet platform in detail a couple of years ago (see here) and declared it "one of the best rugged tablet designs available today." In the meantime, Xplore has broadened the appeal of the platform with a number of versions targeted at specific industries and clienteles, and this latest M2 version continues that tradition with a very timely product.

See the Xplore iX104C5-M2 product page.

Posted by conradb212 at 06:58 PM | Comments (0)

March 06, 2013

When the fire chief wants iPads instead of rugged gear

The other day I was engaged in a conversation at a party. Turns out my conversation partner was the fire chief of an affluent community of about 120,000. We talked about our respective jobs and soon found we had something in common: rugged computing equipment. They use Panasonic Toughbooks, but the fire chief said something that has been on my mind for a while now. He said they liked the Toughbooks just fine, but he considered them much too expensive and they'd just buy iPads instead. He said he doesn't care if the iPads break, they'll just replace them with new ones because consumer tablets cost so little.

I can see that rationale. It's one thing if a professional tool costs 50% more than a consumer grade tool. But another if the professional tool costs five to ten times as much. Over the past few years I've seen large chains buy massive numbers of consumer smartphones and tablets instead of the rugged industrial-grade handhelds and tablets they used to buy. Sometimes it seems like the rugged computing industry is missing out on a great opportunity to benefit from the boom in smartphones and tablets by staying with older technologies and very high-end pricing instead of offering ruggedized versions of what today's consumers want.

Posted by conradb212 at 03:57 PM | Comments (0)

February 07, 2013

Not your father's Celeron

In my last blog article I wrote about the needless demise of netbooks, and how that demise came about because people loved the rock-bottom price of netbooks but found them too small and lacking in performance, so they asked for more size and performance. The industry complied with larger, more powerful netbooks, but that meant they cost more and netbooks weren't netbooks anymore. So people stopped buying them. I also wrote how, in my opinion, Intel's inexpensive Atom processors both made the netbook possible by enabling its low price, and then contributed to its demise with their often unacceptable performance. Unfortunately, that unacceptable performance level also affected a lot of other industrial and vertical market devices based on Atoms.

So we have this unfortunate situation: Atom processors (of which there are by now about 50 different models) don't cost a lot, usually well under US$100, with some in the US$20 range. But they are also very marginal performers, so much so that not only netbook vendors but also a good number of vertical market manufacturers abandoned them, quietly switching to "real" Intel Core processors. Unfortunately, even the low-end mobile Core i3 chips cost in the low US$200 range, and mobile Core i7 chips usually closer to US$400. Those are huge price differences with major impacts on low-cost consumer electronics (though one would think far less impact on much higher priced vertical market computers, where the processor makes up a much lower percentage of the overall purchase price).

What that all means is that there's an unfortunate gap between the inexpensive but rather underpowered Atom chips, and the beefy but much more expensive Core processors. (Oh, and while I'm at it, here's basically the difference between the by now three generations of Intel Core chips: First gen: good performance, but power hogs with insufficient graphics. Second gen: good performers with much better gas mileage but still sluggish graphics. Third gen: good performance, economical, and much better graphics.) But now to get back to the big gap between Atoms and Core processors: there's actually something in-between: Celerons and Pentiums.

Celerons and Pentiums? But weren't Pentiums old chips going back to the early 1990s that were then replaced by Core processors? And weren't Celerons bargain-basement versions of those old Pentiums? Yes, that's so, but there are still Celerons and Pentiums in Intel's lineup, and they are NOT your father's Celerons and Pentiums; they are now slightly detuned versions of Core processors. They should really call them Core-i1 or some such.

But let me explain, because those new-gen Celerons and Pentiums may well be one of the best-kept secrets in the processor world. If you go to the Intel website and look up their mobile processor lineups, you'll find them neatly organized by generation and then by Core Duo, Core 2 Duo, i3, i5, and i7. Celerons are still listed as either Celeron M Processors or Celeron Mobile Processors. The Celeron Ms are old hat and many go back a decade or more. The Celeron Mobile processors, however, include many models that are much newer, with the Celeron 10xx low and ultra-low voltage models launched in Q1 of 2013, i.e. as of this writing. I would have never noticed this, and probably would have continued thinking of Celerons as obsolete bargain processors, had it not been for an Acer mini notebook I just bought as a replacement for my vintage (2008) Acer Aspire One netbook.

You see, my new Aspire One has an 11.6-inch 1366 x 768 pixel screen and is really still a netbook, with netbook looks and a netbook price (I bought it as a refurb for US$250), but it has a Celeron instead of an Atom processor. The 1.4GHz Celeron 877, to be exact, introduced in Q2 of 2012, an ultra-low voltage design with a thermal design power of 17 watts. It uses the Sandy Bridge architecture of the second-gen Core processors and reportedly costs about US$70, no more than a higher-end Atom chip and only about US$25 more than the Atom N2600. Now how would that work, a real Sandy Bridge, non-Atom chip in a netbook?

Turns out very well.

The Celeron-powered little Acer ran a 1,261 PassMark CPU score, compared to Atom-powered devices in our rather comprehensive benchmark database ranging from a low of 163 (Atom N270) to a high of 512 (D510). The Celeron ran CrystalMark memory benchmarks between two and five times faster than the Atoms, and CrystalMark GDI benchmarks between three and five times faster. The Celeron 877 netbook also powered through most other benchmarks much faster than any Atom-based device. As a result, by netbook standards this new son-of-netbook absolutely flies. And it handles HD video, a big sore spot with early netbooks, without a problem.
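Just to make those ratios explicit, here's the quick arithmetic on the PassMark scores cited above (a trivial Python sketch; the figures are from our benchmark database):

    # Relative CPU performance, computed from the cited PassMark scores.
    scores = {"Atom N270": 163, "Atom D510": 512, "Celeron 877": 1261}
    celeron = scores["Celeron 877"]
    for chip in ("Atom N270", "Atom D510"):
        print("Celeron 877 vs %s: %.1fx faster" % (chip, celeron / scores[chip]))
    # Celeron 877 vs Atom N270: 7.7x faster
    # Celeron 877 vs Atom D510: 2.5x faster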

But what about battery life and heat? After all, most of those mobile Atom chips have minuscule thermal design powers of between two and five watts (with the exception of the D510, which is at 13 watts), whereas, though designated an "ultra-low power" chip, the Celeron's TDP is 17 watts. Reviews on the web complain about insufficient battery life of this particular Acer Aspire One (the AO756-2888). Well, that's because to keep the price low, Acer gave this netbook only a wimpy 4-cell 37 watt-hour battery. Most earlier netbooks had beefier 6-cell batteries.

In real life, our benchmark testing has always suggested that Atom power management was relatively poor, whereas ever since Sandy Bridge (second gen), Core processor power management has been excellent. So the difference between Atom and Core-based power consumption can be a lot less than one would assume based on TDP alone. And that is exactly what I found with the Celeron in this new Acer netbook. BatteryMon power draw (with WiFi on) was as little as 6 watts. That's actually LESS than what we have observed in a good number of Atom-powered devices (and also less than my old 2008 Atom N270-powered Acer netbook). Sure, the top end of the Celeron-based machine is so much higher that it can draw down the battery quicker than an Atom device, but under normal use, the Sandy Bridge guts of the Celeron handle power management very, very well. As for heat, my new little Acer has a quiet fan, but it actually stays cooler and the fan comes on less often than the one in my 2008 Atom-based netbook.
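The back-of-the-envelope math bears that out. Assuming the observed 6-watt light-load draw and ignoring discharge inefficiencies, the runtime estimate is simply capacity divided by draw:

    # Idealized runtime: battery capacity divided by average power draw.
    battery_wh = 37.0    # the Aspire One's 4-cell pack, in watt-hours
    draw_w = 6.0         # observed BatteryMon draw with WiFi on
    print("%.1f hours" % (battery_wh / draw_w))   # ~6.2 hours

A beefier 6-cell pack of the kind earlier netbooks carried (typically in the high 40s of watt-hours) would stretch that to roughly eight hours at the same draw, which suggests the small battery, not the Celeron, deserves most of the blame for those battery-life complaints.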

I am not an electrical engineer, and so my conclusions about relative processor performance come from benchmarking, real-life experience, and perusing Intel's tech specs. Based on that, I'd have to say the Pentium and Celeron versions of Intel's Core processors deserve a whole lot more attention in the mobile space. I don't know what it actually looks like at the chip level, but it feels like Intel starts with essentially one design, then adds features here and there (like all the extra Intel technologies in the Core i7 chips) and omits or perhaps disables them in lower-level chips. As a result, the inherent goodness and competence of an Intel Core chip generation may be available in those little-known Celeron and Pentium versions, if not all of the features of the more expensive SKUs.

What does that all mean? Obviously, for those who need to run the latest and most 3D-intensive video game at insane frame rates, only the very best and most expensive will do. And the same goes for those who absolutely need one or more of those extra features and technologies baked in or enabled in i5 and i7 chips. If that is not an issue, the Celeron versions may just be a very well kept secret and a terrific bargain. The Celeron 877 sitting in my lowly new netbook absolutely runs rings around any Atom-based device, and it does so without even trying hard and while treating battery power as the precious commodity it is.

So.... if I were a designer and manufacturer of vertical market industrial and rugged devices, I'd think long and hard before committing to yet another underpowered Atom chip that'll leave customers underwhelmed before long, and instead check what else Intel may have in its parts bin. There are some real bargains there, good ones.

Posted by conradb212 at 04:02 PM | Comments (0)

February 04, 2013

The needless demise of the netbook

Three or so years ago, netbooks sold by the millions. Today, they're gone, replaced by tablets and larger, more powerful notebooks. What happened? I mean, it's not as if tens of millions of people wanted a netbook a few years ago, and today no one wants one.

What's not to like about a small and handy notebook computer that runs full Windows and costs a whole lot less than even inexpensive larger notebooks? So much less that the purchase price of a netbook was close to making it an impulse buy.

The problem was, of course, that while the price was right, netbooks themselves weren't. Running Windows at a crawl on a very small display with marginal resolution quickly turned the netbook experience sour. The very term "netbook" implied quick and easy access to the web, an inexpensive way to be online anytime and anywhere. Well, netbooks were so underpowered as to make that browsing and online experience painful. It didn't have to be that way, but market realities painted the netbook into a corner where it withered and died.

It's not that the technology wasn't there to make netbooks fast and satisfying enough to become a permanent addition to what consumers would want to buy. And it wasn't even that the technology required to make netbooks as powerful as they needed to be without disappointing customers would have been too expensive. It's just that making such products available would have cannibalized more profitable larger notebooks. And consumers who demanded larger, more powerful netbooks at the same low price also weren't thinking it through.

There's a reason why compact technology demands a premium price. An unsubsidized 3-ounce smartphone costs as much as a 50-inch HD TV. A loaded Mini Cooper costs as much as a much larger SUV or truck. And ultra-mobile notebooks have always cost more than run-of-the-mill standard ones. It's the MacBook Air syndrome that dictates that sleek elegance and light weight costs extra. Netbooks broke that rule by promising the full Windows experience in an ultra-compact device at an ultra-low price.

You can't do that in the Wintel world. Something had to give. And that was acceptable performance. I would not go so far as to declare Intel's entire Atom project a frustrating, needless failure, as there are many Atom-based products that work just fine. But the whole approach of making processors not as good and fast as they could be, but throttled and limited enough so as not to interfere with sales of much more expensive processors, is fundamentally flawed. It's like promising people an inexpensive car, and then they find out it can't drive uphill.

So netbooks were flawed from the start in infuriating ways. The 1024 x 600 display format endlessly cut off the bottom of just about everything because just about everything is designed for at least a 1024 x 768 display. And that was the least of netbooks' annoying traits. Performance was the biggest problem. The Atom N270 processor in almost all early netbooks had painfully insufficient graphics performance, and was completely unable to play the HD video that people could generate on every cheap camera and phone. The endless wait for a netbook to complete any task beyond the very basics quickly turned people off. Yes, the small size and weight, the low cost, and the good battery life sold tens of millions of netbooks, but their inadequacy soon relegated them to the dustbin. In my case, I quickly realized that a netbook did not replace a larger notebook with standard performance; it just meant I had to take with me both the netbook AND the real computer.

So people demanded more. The original netbooks had 7-inch screens, but that quickly grew to 8.9 inches for all those Acer Aspire Ones and Asus Eee PCs. And then that wasn't large enough and so the netbook vendors switched to 10.1 inch screens. And then to whatever new Atom processors Intel introduced. Then tablets came and it was just so much easier, quicker and more pleasant to use a tablet to browse the web that the netbooks' shortcomings became even more evident.

With netbooks' fortunes waning but the iPad's tablet success turning out to be frustratingly difficult to copy, netbook vendors gave it one last shot: 11.6-inch screens with full 1366 x 768 720p resolution. AMD processors instead of Atom (short-lived and unsatisfactory). And finally ditching the Atom in favor of Intel Celeron and Pentium chips, which had little to do with the Celeron and Pentium M chips of yore but simply were wing-clipped versions of Intel's Core processors. By doing that, netbooks ceased to be netbooks. They had become smallish notebooks with decent performance, but without the endearing compactness, low weight and rock-bottom prices that once had given netbooks such allure.

And battery life suffered as well. Try as anyone might, it's just not possible to run an 11.6-inch screen and a 17-watt Celeron or Pentium for nearly as long on a battery charge as an 8.9-inch screen with a 2-watt Atom. So that quality of netbooks was gone, too.

Where does that leave all those folks who wanted a cheap and simple little notebook for when space, cost and weight mattered? Nowhere, really. Tablets are wonderful and I wouldn't want to be without mine, but they are not full-function computers. Not as long as real productivity software isn't available for them, and not as long as tablet makers act as if something as simple and necessary as being able to do or look at two things at once were the second coming. Fewer dropped calls, anyone?

So for now, if you peruse Best Buy or Costco or Fry's ads, you either get a tablet or a notebook with a 14-inch screen or larger, or you spring for an expensive MacBook Air or an Ultrabook.

That leaves a big void, and a bad taste in the mouth. For the fact is that there could be totally competent netbooks in the impulse buy price range if it weren't for the reality that Intel makes all those pricey Core processors that all by themselves can cost several times as much as a basic netbook. Which means the myth that you need a real Intel Core processor to run Windows and not just some wimpy ARM chip must be upheld. Personally, I do not believe that for a second, but the financial fortunes of two major technology companies (Microsoft and Intel) are built upon this mantra, and that won't change unless something gives.

So what did I do when my little old 8.9-inch Acer Aspire One finally gave out? First despair because I couldn't find a contemporary replacement, then grudgingly accept the reality of the netbook's demise and buy a new Aspire One, one with an 11.6-inch 720p screen and a Celeron processor. I got a refurbished one from Acer because it was cheaper and comes with Windows 7 instead of Windows 8. So there.

But what if a low, low price is not the issue and you want something really rugged in the (former) netbook size and weight category? Then you get an Algiz XRW from the Handheld Group. It's small and light enough, runs forever on a charge thanks to using a little engine that for the most part can (the Atom N2600), and has a 720p screen good enough for real, contemporary work. And it's for all practical purposes indestructible.

Posted by conradb212 at 07:30 PM | Comments (0)

January 14, 2013

On the Microsoft front ...

Well, on the Microsoft side of things, a couple of areas are becoming a bit clearer. Not much, but a bit.

At the National Retail Federation (NRF) Annual Convention & Expo in New York, Microsoft issued a press release entitled "Microsoft Delivers Windows Embedded 8 Handheld for Enterprise Handheld Devices." That title is a bit misleading as those handhelds, prototypes of which were shown by Motorola Solutions, are not available yet, and Microsoft won't even release the Windows Embedded 8 Handheld SDK until later this year. However, after having stranded the vertical and industrial market with the by now very obsolete Windows Embedded Handheld (nee Windows Mobile 6.5) for a good couple of years, at least now it looks like Microsoft will offer a vertical market version of Windows Phone 8 for all those who want a handheld with a Microsoft OS on it instead of Android.

There will, of course, not be an upgrade path from Windows Mobile/Embedded Handheld to Windows Embedded 8 Handheld, just as there wasn't one from Windows Mobile to Windows Phone 7, or from Windows Phone 7/7.5 to Windows Phone 8. Still, at least the prospect of soon getting an up-to-date mini Windows OS that's reasonably compatible with Windows 8 itself should be a huge relief to all those rugged handheld manufacturers who've been under increasing pressure to offer Android-based devices. Then again, Microsoft once again pre-announcing a product whose SDK hasn't even shipped yet will also further perpetuate the uncertain vertical market handheld OS status quo, and likely lead to more customers deciding to simply get readily available consumer smartphones instead of waiting for the vertical market smoke to clear.

On the tablet side, we have the, by most accounts, less than stellar reception of Windows 8. Microsoft will likely correct the situation with Windows 8 over time, but as far as tablets go, it's pretty easy to draw some preliminary conclusions: Like, no matter how good the Windows Surface RT tablet hardware was/is, without being able to run what most people will consider "Windows" for many years to come, Windows RT is simply not going to fly. If the Metro interface were a runaway hit and there were tons of Metro apps, perhaps. But as is, anyone who needs to use any "legacy" Windows software is out of luck with Windows RT. So it's a Windows CE situation all over again: Windows RT must not be too powerful or else it'll eat into Windows 8 market share. And there can't be a perception that ARM-based tablets are capable of running "real" Windows, or else there'd be no reason to spend a lot more for an Intel-based tablet.

Posted by conradb212 at 06:11 PM | Comments (0)

January 04, 2013

Big changes at General Dynamics Itronix

Eagle-eyed RuggedPCReview readers may have noticed something missing from the front page of our site: the General Dynamics Itronix logo in the site sponsor column. Yes, for the first time since the launch of RuggedPCReview, Itronix is not among our sponsors anymore. That's sad as Itronix was our first sponsor, and prior to that we had covered all those rugged Itronix GoBooks and other rugged mobile devices in Pen Computing Magazine since the mid-1990s.

What happened? We're not sure, but an email exchange with Doug Petteway, General Dynamics C4 Systems director of product management and marketing, yielded that the company is "restructuring its portfolio of rugged products to focus more on high value targeted solutions rather than the mass commodity market" and that while they'll continue selling the GD6000, GD8000 and GD8200 rugged notebooks through early 2013, the entire rest of the lineup of Itronix rugged mobile computing products is discontinued.

Petteway made the following statement:

"At General Dynamics C4 Systems, we have a set of core capabilities that we are leveraging aggressively to expand and grow in key markets. To maximize our potential for success, we must continually assess and refine our portfolio, investing in critical gap-filling capabilities that enable us to deliver highly relevant “must-have” solutions while also phasing out offerings that are no longer in high demand, freeing up valuable investment resources.

After in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products. This decision may affect the solutions customers buy from us today. Please know that General Dynamics C4 Systems’ management team wants to assure you that our customer needs remain our first priority.

As always, customer satisfaction is paramount and we will continue to ensure our customers receive the service and support in full accordance with our warranty commitments.

We remain focused on being an industry leader with proven, high value communications, computing, security and mobility solutions for our customers.

Additional announcements will be made in the near future."

That doesn't sound very good, and not having all those rugged Itronix notebooks and tablets available anymore is a big loss. We wish Itronix all the best, whatever course General Dynamics has in mind for them.

Posted by conradb212 at 12:06 AM | Comments (0)

November 30, 2012

Surface with Windows 8 Pro pricing contemplations -- an opportunity for traditional vendors of rugged tablets?

On November 29, 2012, Microsoft revealed, on its Official Microsoft Blog (see here), pricing for its Surface with Windows 8 Pro tablets. The 64GB version will cost US$899 and the 128GB version runs US$999. That includes a pen but neither the touch nor the type cover. They cost extra.

So what do we make of that?

Based on my experience with the Surface with Windows RT tablet, I have no doubt that the hardware will be excellent. With a weight of two pounds and a thickness of just over half an inch, the Pro tablet is a bit heavier and thicker than the RT tablet, but still light and slim by Windows tablet standards. The display measures the same 10.6 inches diagonally, but has full 1920 x 1080 pixel resolution compared to the 1366 x 768 pixels of the RT tablet. That's the difference between 1080p and 720p in HDTV speak. There's a USB 3.0 port and a Mini DisplayPort jack. Under the hood sits a 3rd gen Intel Core i5 processor as opposed to the Nvidia Tegra 3 ARM chip in the RT model. And both RAM and storage are twice what the RT tablet has. All that certainly makes for an attractive tablet.

What customers of a Surface with Windows 8 Pro get is a modern and rather high performance tablet that can be used with a pen or a mouse in desktop/legacy mode, and with touch in the new Metro mode with all the live tiles and all. You can use the pen in Metro mode, of course, but Metro wasn't designed for that. And you can use touch in legacy mode, but as 20 years of experience with Windows tablets has shown, legacy Windows does not work well with finger touch. Still, this will most likely be good hardware that makes full Windows available in a tablet, and also allows evaluating Metro in its native mode.

But let's move on to the ever-important price. And here Microsoft faced an unenviable task. Microsoft tablets had to be price-competitive with the iPad, and the Surface RT tablets are. Except that so far they have not been accepted as "real" Microsoft tablets because they cannot run legacy Windows software. The Windows 8 Pro tablets are real Windows tablets, but they now cost more than iPads. Sure, they have more memory and ports and a memory card slot and an Intel Core processor, but the perception will still be that they cost more than iPads and are thus expensive. That's somewhat unfair because the i5 processor in the Microsoft tablet alone costs more than most consumer Android tablets. But this is an era where you can get an impressive, powerful and full-featured notebook for 500 bucks or so, and a sleek Ultrabook for well under a grand. That makes the tablet look expensive.

Price, in fact, has always been a weak spot with Windows-based tablets. Witness a bit of tablet history: the first pen tablets in the early 1990s cost almost $4,000. Even in an era where notebooks cost much more than what they cost today, that was too much, and it was one of the several reasons why early pen tablets failed in the consumer market. Tablets did remain available in vertical markets throughout the 90s, albeit usually at around $4,000.

In 2001/2002 Microsoft tried again with their Tablet PC initiative. The goal there was to bring the tablet form factor, beloved by Bill Gates himself, to the business and consumer markets. The price was to be lower, and to make that possible Microsoft initially mandated the use of inexpensive Transmeta processors. When they turned out to be too slow to drive the Windows XP Tablet PC Edition at an acceptable clip, everyone turned to Intel, and the average 2002-style Tablet PC ran around US$2,000. Which was still too expensive for the consumer market where customers could pick up a regular notebook for less.

Unfortunately, while two grand was too steep for consumers, the side effect was that companies like Fujitsu, Toshiba, and everyone else who had been selling tablets in the 90s now had to offer theirs for half as much as well, losing whatever little profit came from tablet sales in the process. What's happening now is that the Surface for Windows 8 Pro again halves the price people expect to pay for a tablet. And again there may be a situation where the public considers Microsoft's own Windows 8 tablets as too expensive while the verticals have to lower their prices to stay competitive with Microsoft itself.

And that won't be easy. Vertical market vendors have done a remarkable job in making business-class Windows 7 tablets available for starting at around US$1,000 over the past year or so. But those tablets were almost all based on Intel Atom processors which are far less powerful than what Microsoft now offers in their own Windows 8 Pro tablets. So we have a situation where Intel pushed inexpensive Atom processors to make inexpensive tablets possible, but Microsoft itself has now upped the ante for its licensees by offering much more hardware for less.

Ouch.

It's hard to see how this could possibly leave much room for the traditional makers of business-class Windows tablets. Unless, that is, they find a way to compellingly answer the one question we've been hearing ever more loudly over the past couple of years: "we need a tablet like the iPad, but it must run Windows and be a lot more rugged than an iPad." Well, there's the niche. Tablets that match the iPad's style and Microsoft's newly established hardware standard, but a whole lot tougher than either and equipped with whatever special needs business and industrial customers have.

That ought to be possible. The traditional vertical market tablet makers and sellers already know their markets. And unlike the designers of consumer market tablets, they know how to seal and protect their hardware and make it survive in the field and on the job. What that means is that Microsoft's pricing for their Surface tablets may well be a glass half full for the rugged computing industry, and not one half empty.

Anyone for a sleek yet armored ULV Core i5 or i7-powered, IP67-sealed tablet with a 1080p dual-mode and sunlight viewable procap/active pen input display, a 6-foot drop spec, dual cameras with a 4k documentation mode, 4G LTE, and integrated or modular scanner/RFID/MSR options?


Posted by conradb212 at 08:31 PM | Comments (0)

November 21, 2012

Windows RT: how suitable is it for vertical markets? (Part II)

I had planned a quick follow-up on my first impressions of the Microsoft Surface RT tablet and Windows RT in general. But now it's almost a month later, so why the hesitation?

It's not because of Microsoft's hardware. I am as impressed with the Surface RT tablet as I was when I first took it out of its box. It's a truly terrific device. If after a month of use about the only gripe is that you still can't easily find the on-off button, you know the hardware itself is good. So no issues there. It never gets hot or even warms up. Battery life is practically a non-issue, like on the iPad. It's plenty fast enough. Honestly, the argument that for real performance and real work you need an Intel processor is pretty thin. What it really feels like is that Microsoft is in the difficult spot of having to artificially hold ARM hardware back via Windows RT so that it won't compete too much with Intel hardware, but at the same time Microsoft doesn't want to come across as being uncompetitive on ARM platforms. Tough position to be in.

And then there's the whole concept of Windows 8. I really did not want to get into a discussion of operating systems, but Microsoft makes it hard not to. Especially if you've been covering Microsoft's various mobile and pen/touch efforts over the years.

One giant problem is that Microsoft still does not want to let go of the "Windows on every device" maxim. So Windows 8 is on the desktop, on notebooks, on tablets and on phones, with Microsoft claiming it's all the same Windows, though it's quite unclear to most whether it really is the same Windows or not. So from a practical perspective, what exactly is the advantage of the tile-based "Metro" look on all those very different computing platforms when you really can't run the same software anyway? Yes, the fairly consistent look is probably good for brand identity (as if Microsoft needs more of that), but at best it's inconvenient for users who have to deal with this one-size-fits-all approach.

And there are some other issues.

For example, what's the deal with the "flatness" of everything in Windows 8 and RT? Not long ago everything had to be 3D and layered, and now everything has to be completely flat? There is simply no good argument for that. 3D components on a screen always help make things more manageable and more obvious (let alone better looking), so complete flatness for complete flatness' sake seems weak.

Then there's the peculiarly low density of almost everything I've seen so far in Metro. Maybe that's just because Metro is only getting started, but between the Kansas-like flatness and very little on the screen, it feels strange and empty, and it means a lot of panning left and right.

And by far the biggest beef: why try to shoehorn everything into one operating system? It is very abundantly clear that traditional Windows apps, the kind that hundreds of millions use every day, are simply not for touch operation and may never be. Just because it's simple to touch here and there and use touch to consume information on small media tablets doesn't mean touch is the way to go with the much more complex interactive software most people use for work. Pretty much all of the creative work I do, for example, requires the pinpoint accuracy of a mouse: editing, image processing in Photoshop, layout in Quark Xpress, etc., etc. I cannot see how that can be replaced by just tapping on a screen.

So from that perspective, it does seem like Microsoft has simply done what the company has done every time in the past 20 years when new and disruptive technology came along -- it paid lip service by putting a fashionable layer on top of Windows. That's what happened with Windows for Pen Computing (1992), the Pen Services for Windows 95 and then 98, and the Windows XP Tablet PC Edition (2002). Only this time the disruptive technology (tablets) has found widespread enough acceptance to really get Microsoft's attention.

And a couple of personal peeves in Windows RT:

First, I find the live tiles annoying. All the constant moving on the screen is distracting, and in corporate environments it's certainly a constant temptation, with people getting sidetracked into consuming information. Let me make the decision what I want to do next, rather than have a screen full of tiles vying for my attention like a wall of living pictures in a Harry Potter movie.

Second, if Metro is indeed Microsoft's interface and operating environment of the future, does that mean we'll have come full circle from having just one app per screen to task switching to, finally, software that allowed as many windows as we wanted, just to get back to task-switching one-thing-at-a-time? That, given the right apps, may be good on small tablets, but it's definitely not the way I'd want to work on the desktop or even on a laptop.

Oh, and a third... if Microsoft is concerned about being so far behind with available apps in its store, it really doesn't show. If they were concerned, why would the store be as ultra-low density as it is, with no way of quickly finding what you really want? The store interface seems minimal to a fault.

But on to Windows RT and its suitability for vertical markets. That actually might work, although there are several big ifs.

Windows RT for vertical markets: PRO

Economical hardware -- Judging by the initial Surface RT tablet, ARM-based Windows RT-powered tablets could be a perfect solution for numerous vertical market deployments. They are light, simple, quick, don't heat up, get superior battery life, and they cost less.

No virus/malware -- Users don't have to worry about viruses and malware because a) the main focus of the bad guys will remain Windows 8 proper, and b) all software must come from the Microsoft app store. That could be a big argument for Windows RT.

Device encryption -- There's device-level encryption in Windows RT. That can be done in Windows 8 also (via BitLocker and other utilities), but in Windows RT it's in the OS itself.

Custom stores -- From what I hear, vertical market vendors will be able to have their own showrooms in the Microsoft store that only users of that vendor's hardware can see. That would/will be a great benefit for both users and vendors.

Microsoft Office -- Microsoft Office comes with Windows RT. I haven't done a feature-by-feature comparison with "real" Office, and there are those who say Office RT is a dumbed-down version of Office. All I can say is that Office RT will meet the needs of a whole lot of users. If it's dumbed down, it's infinitely less dumbed-down than Office on Windows CE and Windows Mobile was. There are, however, some licensing issues as, at least for now, Microsoft considers Office RT not for commercial use.

Legacy and leverage -- Microsoft has always used the leverage argument ("your users and programmers already know Windows, and this will fit right in"), and Windows RT will probably benefit from that as well. It's curious how many of the age-old Windows utilities and apps actually run on Windows RT, and Windows RT will probably fit much more easily into a corporate Windows infrastructure than Android or iOS.


Windows RT for vertical markets: CON

Confusion -- You'll forever have to explain (and wonder) what exactly works and what doesn't work on Windows RT compared to Windows 8. Some may decide it's easier to just use Windows 8 instead.

Still not pure tablet software -- Unlike with Android and the iPad, Windows RT users still have to fall back into desktop mode for Office and perhaps other functionality (settings, configurations, etc.) where touch just doesn't work well and you really need a mouse. You can use any USB mouse with Windows RT, but it's frustrating to never know if you need a mouse on your new tablet or not.

Artificial limitations -- Since Windows RT is not meant to compete too much with the Wintel side of Windows 8, there are hardware and software limitations to deal with in Windows RT, whether they make sense or not. Users are the victims here.

Vendor predicament -- How is a hardware vendor to make the call on Windows 8 versus Windows RT? Offer both? Make cheaper RT versions? That's exactly the kind of predicament vendors used to have with Windows versus Windows CE (CE lost).

So for now, as far as the suitability of Windows RT for vertical markets goes, I'll have to give an "A" for current Windows RT tablet hardware. It's really excellent, and ARM-based hardware could really be a boon for integrators and vertical market vendors; a "B-" for Windows RT itself, because for now Metro is too limited to be of much use; and a "D" for clarity of concept as it's totally unclear where Microsoft is headed with RT.

Posted by conradb212 at 05:30 PM | Comments (0)

October 27, 2012

Windows RT: how suitable is it for vertical markets? (Part I)

Though as of this writing, October 27, 2012, Windows 8 and RT were just officially unveiled a couple of days ago, reams have already been written on Windows 8 and, to a much lesser extent, Windows RT. We got our Surface RT tablet on October 26 with the intent of reporting on the Surface hardware and RT software in some detail. However, our emphasis will be on their suitability for vertical and industrial markets.

So what about Windows RT? The general word on it has been that it's a special version of Windows 8 for devices with ARM processors. A special version that will not be able to run any legacy Windows software, one that does not offer users the legacy desktop to go about their Windows business, and one where you cannot install software other than by downloading it from the official Windows store. Engadget clearly stated in its review of Windows Surface: "Windows RT can't run legacy programs written for traditional, x86-based Windows systems."

Is this all so?

Yes, and perhaps no.

So here's what we found so far on our Surface tablet.

It comes with Microsoft Office 2013, and you run those versions of Word, Excel, PowerPoint and OneNote on the Windows RT desktop. We took screen shots of Word, Excel and PowerPoint, and here's what the apps look like (click on the pics for full-size versions):

Note that Office RT isn't final yet. It'll be a free download when it is. From what I can tell (and I am not an Office expert), even what comes with Windows RT now is a full version of Office, and not some micro version like Windows CE/Mobile used to have. This is the real thing.

Anyone who expected Office to be totally touch-optimized for Windows RT will be disappointed. It's not. You can use it with touch, but it can be a frustrating experience. And the touch keyboard doesn't help. Fortunately, you can simply plug in any old mouse or keyboard or mouse/keyboard combo and it works with Windows RT right off the bat.

Below is a screen capture of a PowerPoint presentation. (And yes, I picked the slide that shows Alan Kay predicting it all back in 1968, and the original IBM ThinkPad tablet from 1993, on purpose.)

If you take a closer look at our Word and Excel screen captures, you'll notice that not only are they in their own windows, we also have legacy Windows apps like Paint, Notepad, Calculator, the Math Input Panel, a system shell and the old performance monitor running. Interestingly, they do run (and many others, too, like Remote Desktop, Windows PowerShell, the whole Control Panel, etc.), and you can even pin them on the old Windows task bar. In fact, there's a lot of old legacy Windows stuff down in the basement of Windows RT. And much of it seems as functional as ever.

I am not sure what to make of that. After all, legacy Windows software is not supposed to run on ARM, yet a good number of the old programs do run. The likely explanation is that Microsoft recompiled those bundled desktop utilities for ARM, which would be why they work while third-party x86 software does not.

Unfortunately, that doesn't mean you can simply install and run other old software. If you do, there's a message that says you can only install software from the Windows store.

So what's our preliminary impression of Windows RT on a Surface tablet? Quite positive. The 1.3GHz quad-core Nvidia Tegra 3 CPU has plenty of power to make RT tablets perform well. The Nvidia setup doesn't need a fan and the tablet never even warms up, at all. And it seems to run almost ten hours on a charge.

Check back for more commentary on the suitability of Windows RT hardware and software for vertical markets.

Posted by conradb212 at 06:19 PM | Comments (0)

October 16, 2012

Windows Surface tablets will be here shortly

Now this should be interesting. On October 16, 2012, Microsoft announced more detail on its upcoming Windows Surface tablets. And though labeled as a "pre-order" with limited amounts, customers could actually order the Windows Surface RT tablet of their choice from the Surface page on Microsoft's online store. For delivery on or before October 26th, i.e. within ten days.

So the pricing of the Microsoft Windows RT tablets is no longer a secret. The basic 32GB tablet without a keyboard touch cover is US$499, the touch cover adds a hundred bucks, and the 64GB version with touch cover is US$699. That gets you a Microsoft-branded tablet that's as slender as the iPad, though it weighs a tiny bit more (1.5 vs 1.44 pounds). The Microsoft tablet looks wider because its 10.6-inch screen has a wide-format 16:9 aspect ratio compared to the iPad's 4:3.

There's a standard USB port (which the iPad doesn't have) and a standard microSD card slot (which the iPad also doesn't have). There's a capacitive touch screen of course, and two 720p cameras, meaning the Surface tablet is for video and not so much for taking pictures (for that you'd want higher res). The 1366 x 768 pixel resolution is more than the original iPad and the iPad 2's 1024 x 768, and it's also what's called 720p in HDTV and video speak, so it should be good for video playback.

All the expected sensors are there: ambient light, accelerometer, gyroscope and compass, meaning the Surface will be able to do the same tricks customers have come to expect from modern apps. And speaking of apps, the Surface RT tablet comes with Microsoft Office Home and Student 2013 RT (see here). It's not the final, final version, but it'll be a free update when that becomes available.

There's WiFi and Bluetooth, but no mobile broadband, so these initial versions of Microsoft's RT Surface tablets will need to be within the reach of a WiFi access point to be online. The processor is of the Nvidia Tegra variety, i.e. the type that has been powering the majority of Android tablets out there.

What's new and different is Windows RT, a version of Windows that runs on ARM processors and doesn't need the presumably more complex x86-based hardware required to run full Windows. What exactly that means remains to be seen. It's said that the Surface RT tablets are aimed at the consumer market, but the iPad was, too, and now it's used almost everywhere. How exactly will Windows RT work? How will it resonate with customers who have come to expect elegant, effortless simplicity from tablets? No one knows just yet.

And how will it all relate to Surface tablets with full Windows 8, tablets that will, at least the Microsoft Surface versions, look very much like the Surface RT tablets, but have beefier hardware (anything from the latest Atom to third gen Core processors), higher resolution (1920 x 1080), and more storage? Will the two co-exist, with users selecting one or the other depending on their needs? The Windows 8 Pro versions will inevitably cost a good bit more, but how much more can a market bear where consumers have been spoiled with very inexpensive, very powerful notebook computers for years? Much will probably depend on how Windows 8 pans out.

Finally, what will it all mean to vertical and industrial market tablets? Will there be rugged tablets running Windows RT? Or will the ever-important leverage factor dictate that most enterprise and industrial tablets remain x86-based and compatible with legacy Windows? No one knows.

So for now I ordered a Surface RT tablet, just to see how it works and what it's all about.

Posted by conradb212 at 11:28 PM | Comments (0)

October 02, 2012

Motorola Solutions' acquisition of Psion: Good, bad, or ugly?

Well, it's done. Psion is now part of Motorola Solutions. On October 12th, 2012, Ron Caines and Frederic Bismuth of Psion and Mark Moon of Motorola Solutions sent out the following note to their customers:

Dear Psion Customer:

We are writing to let you know that today Motorola Solutions completed the acquisition of Psion PLC.

Motorola Solutions is a leading provider of mission-critical communication systems and a pioneer in enterprise mobility solutions. The company has always been focused first and foremost on how to best serve its customers and chose to acquire Psion because of its complementary enterprise mobile computing products and its talented people who understand this highly specialized business. We are excited about what this opportunity brings you as a valued Psion customer. Bringing the Psion family of products onboard allows Motorola Solutions to extend its portfolio and better serve customers by delivering solutions in expanded use cases, especially in warehousing, cold chain, ports, yards and specialized modular applications.

Integration of the two companies has only just begun today. There will be no immediate changes to your account management, the partners that serve you or the products and services you receive from Psion. Customers who previously purchased or will purchase Psion products can be assured their products will be fully serviced and supported for the full duration of the contracts. All customer support numbers also remain the same.

Furthermore, Motorola Solutions is committed to investing jointly around its and Psion's technical strengths and capabilities to deliver compelling solutions for the various applications and markets that both Motorola Solutions and Psion have served.

Once we have worked through the details of the integration, we will share those plans with you. You can be assured that throughout this process we will remain focused on building on Psion's relationship with you and serving all of our customers.

If you have any questions, please contact us or your Psion representative. Thank you for your continued loyalty and support.

With many of the smaller, independent manufacturers of rugged computing equipment being swallowed up by larger companies, this was perhaps inevitable. To many rugged computing enthusiasts and insiders, also inevitable is the question "why?", as there is rather substantial product line overlap between the two companies. In an informal conversation, a Motorola source said that the acquisition of Psion adds complementary handheld products and vehicle-mount terminals to Motorola's offerings. The acquisition, the source said, also supports their international growth strategy by providing an attractive global installed base.

That's certainly true, but if the history of such acquisitions has shown anything, it's the latter reason rather than the former. As is, purchased product lines almost inevitably get absorbed. They may live on for a while, but in the longer run it makes no sense to carry duplicate lines. That's too bad, as Psion was really on to something with their modular approach to rugged handheld computing platforms. What will become of the innovative ikôn, Neo, and Omnii? The tough WorkAbouts? The panels that still have the old Teklogix DNA?

So for now, we reflect on what was. Through Pen Computing and RuggedPCReview.com we covered Psion for a very long time. First those really terrific little clamshell handhelds that were better than anything based on Windows CE at the time, then the acquisition of Teklogix in 2000 (I was at the press conference in Chicago when it was announced), the Psion netbooks way before the world bought tens of millions of "netbooks," and always the rugged handhelds. We had a close relationship with Psion most of the time; at some point we even had a "Psion PSection" in Pen Computing Magazine (with some of the columns still online at pencomputing.com/Psion/psection.html).

So here's hoping that Moto Solutions will aim for, and succeed in, creating the synergy that is always given as the reason for an acquisition. After all, Moto's own Symbol Technologies is well aware of the good (its own flourishing after being acquired by Moto), the bad (Intermec > Norand), and the ugly (Symbol > Telxon).

Posted by conradb212 at 03:11 PM | Comments (0)

August 31, 2012

"The Windows Marketplace for Mobile for windows mobile 6.x devices is closing"

"The Windows Marketplace for Mobile for windows mobile 6.x devices is closing" -- that was the title of a March 8, 2012 entry at answers.microsoft.com. In it, it said among other things, "Beginning May 9, 2012, the Windows Mobile 6.x Marketplace service will no longer be available. Starting on this date, you will no longer be able to browse, buy or download applications directly on your Windows Mobile 6.x phone using the Windows Mobile 6.x Marketplace application and service." Signed The Windows Phone Team (with "Ready for a new phone? Explore the latest Windows Phones -- now with over 60,000 applications and games available!" in their signature). I mean, the fact that the announcement was made by the Windows Phone team, whose job it is to replace Windows Mobile, and not whoever is responsible within the Windows Embedded contingent tasked with presiding over Windows Embedded Compact speaks volumes.

Good Grief.

What was Microsoft thinking? The one saving grace of what's left of Windows Mobile or Windows Embedded Compact, or whatever it's called these days, was the Windows Marketplace from which you could download apps directly into the device. Whenever I got a new Windows Mobile device for testing, the first thing I always did was download a few essentials, such as Google Maps, Bing, Facebook, Handmark's ExpressNews, a couple of utilities and converters, etc. Now you can't even do that anymore.

It's as if Microsoft (or whatever feuding faction within Microsoft presides over the demise of Windows Mobile these days) had dropped even the last ounce of pretense that they intend to maintain Windows Mobile as a viable contender to iOS and Android. Windows Mobile never was that, of course, but the nicely done Marketplace at least let long-suffering users personalize their devices to some extent. No more.

That is truly regrettable. I don't think anyone ever loved Windows Mobile, but the fact is that even today, in 2012, the vast majority of industrial and vertical market mobile hardware still runs one version of Windows Mobile or another. By ditching the Marketplace, Microsoft has now made sure that Windows Mobile devices are truly usable only via 100% custom-designed software that mostly avoids the OS interface altogether.

That is not a happy situation for all the rugged hardware vendors who have faithfully designed, manufactured and marketed innovative, reliable, high quality devices for all those years, and now are saddled with an ancient software platform that is neither supported properly by Microsoft, nor competitive against newer platforms, even those incompatible ones from Microsoft.

Posted by conradb212 at 04:56 PM | Comments (0)

August 11, 2012

Performing under pressure

As I am writing this, the London Olympic games are coming to an end. What two weeks of intense competition proved again is that winning means meticulous preparation, at times a bit of luck, and always the ability to perform under pressure. The latter made me think because rugged computers are all about the ability of a piece of equipment to perform under pressure. Pressure as in heat, cold, dust, rain, sun, and whatever else may keep a system from running at peak efficiency.

Ruggedness testing is designed to determine if systems hold up under pressure, but are the tests really meaningful? Many probably are. If, for example, a system is dropped a number of times from a certain height and still works afterwards, chances are it'll survive similar drops out there in the field. But are all tests as meaningful?

A while ago a manufacturer of rugged computers challenged us to test computing performance not just in an office environment, but also over the entire listed operating temperature range. We did, and not surprisingly, the machinery supplied by that company passed with flying colors, i.e. it ran through the benchmarks as fast at freezing and near boiling temperatures as it did at the 72F we usually have in the test lab.

But, as we subsequently found out, that seems to be the exception. We've been doing benchmark testing on some other rugged devices under thermal stress, and the results are reason for concern. If a rugged handheld, laptop or tablet is supposed to be used out in the field, it's reasonable to assume it'll be asked to perform at peak efficiency at temperatures one might encounter outdoors or on the job. Depending on where you are, that might easily include temperatures well over 100 degrees. Such work may well include prolonged exposure to the sun, where a device can heat up beyond ambient temperature. If it is 105 degrees outdoors, temperatures may easily reach 115 or 120 degrees or even higher if you set the device down somewhere, or if it's left in a car. So what happens to performance then? Can the device perform under pressure?

Turns out, not all can.

Running our standard benchmarks after leaving rugged systems out in the California summer sun showed performance drops of 50 to 80%. That's pretty serious. Is it acceptable that a piece of equipment that's supposed to be used outdoors then runs at half speed, or even at a fraction of it? I'd say not. Think of the potential consequences. Tasks may take twice to several times as long, potentially affecting critical decisions.
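For what it's worth, demonstrating the effect doesn't require a full benchmark suite. Here's a minimal sketch of the kind of test involved, assuming a Linux system with the third-party psutil package installed (temperature sensor names vary by machine):

    # Minimal sketch: pair a crude compute benchmark with a CPU temperature
    # reading, so thermal throttling shows up as falling iterations/second.
    import time
    import psutil

    def iterations_per_second(duration=5.0):
        end = time.time() + duration
        count = 0
        while time.time() < end:     # tight loop; slows if the CPU throttles
            count += 1
        return count / duration

    for _ in range(10):
        cores = psutil.sensors_temperatures().get("coretemp", [])
        celsius = cores[0].current if cores else float("nan")
        print("%5.1f C   %12.0f iterations/s" % (celsius, iterations_per_second()))

Run indoors and then again after the device has baked in the sun, the iterations-per-second column tells the throttling story at a glance.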

Is it reasonable to expect full performance under extreme conditions? Not necessarily. Extreme conditions can have an impact on electronics, and there may be justifiable, reasonable precautions to limit performance so as to safeguard the equipment and its life. But is it acceptable to see performance drop to a fraction at the limits of a listed operating temperature range? It's not. Customers should know what level of performance they can expect when the going gets tough.

Like at the Olympics, performance under pressure separates the rugged system winners from the also-rans. This really needs to be addressed.

And it's not a simple issue. Complex electronics such as processors have sophisticated internal power management. Boards have sensors that report temperatures to control mechanisms that may then throttle system performance. Firmware and the OS may also monitor environmental conditions and then engage fans or throttle performance. The hardware itself may have inherent design limitations. Variables such as the glass transition temperature, or Tg, come into play; Tg is the temperature at which polymer materials go from a glassy state to a rubbery state. The types of capacitors used matter. Conformal coating can protect boards. HALT testing can predict real-life reliability better than the simple mean time between component failures. And so on.

All of this is standard practice in embedded systems design. It should be fully and universally adopted in rugged mobile system design as well.

Posted by conradb212 at 04:46 PM | Comments (0)

June 26, 2012

Microsoft's entry into tablet hardware a result of partner failure?

Ever since Microsoft provided a glimpse at a couple of "Surface" tablet hardware prototypes, some in the media have been describing Microsoft's apparent entry into the hardware market as the result of its hardware partners' failure. As if, somehow, the combined might of the world's computer manufacturers had failed to come up with tablet hardware good enough to do Windows justice.

Nothing could be farther from the truth.

The reason why Windows-based tablets were never a major commercial success lies squarely in Microsoft's corner, and not in that of the hardware partners. To state the very obvious: Windows has never been a tablet operating system. It was designed for use with a keyboard and a mouse. It does not work well with touch, and it never worked well with pens.

If anything, hardware partners went out of their way with innovative ideas and products to make Windows work in Microsoft-mandated tablets. And let's not forget that it was Microsoft itself that, well into the lead-up to the 2002 Tablet PC introduction, began pushing convertible notebooks rather than tablets. Apparently, the company had so little faith in its own Tablet PC project that it seemed safer to introduce the Tablet PC Edition of Windows XP on a notebook with a digitizer screen rather than on a true tablet. That, of course, made Tablet PCs bigger, bulkier and more expensive.

Let's also not forget that Microsoft mandated an active digitizer for the 2002 Tablet PC because active pens better emulated the way a mouse (and with it, Windows) worked. Touch was definitely not part of the Tablet PC.

Microsoft's hardware partners did the absolute best they could within the great constraints of the Windows OS. In the 1990s, companies like GRiD, Fujitsu, Toshiba, NEC, IBM, Samsung, Compaq and many others came up with numerous tablet computer solutions trying to somehow make Windows work in smaller, lighter, handier platforms without physical keyboards. In the 2000s, a whole roster of hardware partners came up with tablet and tablet convertible hardware when Bill Gates proclaimed that by 2006, tablets would be the most popular form of PCs in America. They (Motion Computing, Fujitsu, Acer, Toshiba, Panasonic, etc.) invested the money and they carried the risk, not Microsoft.

Add to that the unsung heroes of the tablet computer form factors, the companies that made all those vertical market tablets for applications where it simply wasn't feasible to carry around a big laptop. They made do with what they had on the operating system side. And they did a remarkable job.

To now complain about "partner failures" is simply asinine. And given that even now, hardware partners will have to decide whether to bet on x86 Windows 8 or ARM Windows RT, will they again be blamed if one or both flavors of Windows 8 fail to make inroads against the iPad and Android tablets?

Posted by conradb212 at 10:47 PM | Comments (0)

June 21, 2012

Windows Phone 8...

Sometimes I wish I could be a fly on the wall to listen in when Microsoft's mobile folks make their decisions.

I mean, a few years ago they found themselves in a position where, against all odds, their erstwhile omnipotent foe Palm collapsed and left Windows Mobile as the heir apparent. So did Microsoft take advantage of that? Nope. Instead, they failed to improve their mobile OS in any meaningful way, all the while confusing customers by endlessly renaming the thing, and handing leadership over to the phone companies.

Then Apple came along and showed the world how smartphones were supposed to be. Apart from grafting a Zune-like home screen onto it, Microsoft had done virtually nothing to advance Windows CE beyond its mid-1990s roots. Then they came up with Windows Phone 7, which was a whole lot better, but completely incompatible with any earlier Windows CE/Windows Mobile devices and software.

While Phone 7 and the Phone 7.5 update were billed as the future, apparently they weren't, as now there will be Windows Phone 8, which is... completely incompatible with Phone 7/7.5. And why? Because Phone 8 will supposedly share the same Windows kernel that "real" Windows has (though presumably not the ARM versions). So if Windows Phone 7/7.5 still had Windows CE underpinnings, why were those versions not compatible at all with earlier Windows CE/Windows Mobile versions? It's just all so confusing.

And about the shared Windows kernel: Wasn't the very idea of Windows everywhere why Windows failed in so many areas that were not desktop or laptop?

In this industry, one absolutely never knows what's going to happen. Palm was considered invincible, Transmeta was supposed to succeed, Linux was to be the next big thing, the iPhone and then the iPad were widely derided as lacking and a fad when they were first introduced, and Android was certain to quickly challenge iOS in tablets. So perhaps Windows Phone 8 will somehow become a success. But then why baffle the public with Windows 8 for the desktop; Windows RT, which isn't quite Windows, for ARM tablets; two versions of "Surface" tablets; and then Windows Phone 8 devices that share the Windows kernel but are somehow separate anyway?

Go figure.

Posted by conradb212 at 08:19 PM | Comments (0)

May 30, 2012

Android finally getting traction in vertical and industrial markets?

Just when Windows 8 is looming ever larger as perhaps a credible competitor to iOS and the iPad, we're finally starting to see some Android action in vertical market tablets and handhelds. It's timid, exploratory action still, but nonetheless a sign that the industry may finally break out of the stunned disbelief it fell into as Apple sold first millions and then tens of millions of iPads.

What has changed? Perhaps it's the fact that it's becoming increasingly hard to argue against Android as a serious platform now that Google's OS dominates the smartphone market. Though it seems more fragmented than ever, Android is now on hundreds of millions of smartphones, and all of them are little mobile computers much more than phones. The fragmentation is certainly an issue, as is the large variety of mobile hardware Android runs on, but it's also a sign of the times. Cisco recently published the results of a study which showed that 95% of the surveyed organizations allowed employee-owned devices, and more than a third provided full support for them. It's called the "Bring Your Own Device" syndrome, and for Cisco it was enough to ditch its own Cius tablet hardware. What it all means is that people will want to use what they own, know and like, and in tablets and handhelds that's iOS and Android.

There's also been movement on the legal front. Oracle had been suing Google for patent infringement over some aspects of Android, and since Oracle is a tenacious, formidable opponent in whatever they tackle, this cast a large shadow over Android. Well, Google won, for now at least, when a jury decided Google had not infringed on Oracle's patents.

So what are we seeing on the Android front?

Well, there's DRS Tactical Systems that just announced two new rugged tablets with 7-inch capacitive touch displays. They look almost identical, but they are, in fact, two very different devices. One runs Android, one Windows, and DRS made sure the hardware was fully optimized for each OS, with different processors, different storage and different controls. That's costly, and it shows that DRS sees Android as having just as much of a chance to be the platform of choice in mobile enterprise applications as does Windows.

There's Juniper Systems, which revealed that its unique 5.7-inch Mesa Rugged Notepad will soon be available in an Android version called the RAMPAGE 6, courtesy of a partnership with Pennsylvania-based SDG Systems. The Juniper Mesa is powered by the ubiquitous Marvell PXA320 processor. If the Android version uses this same chip, we'd finally have an answer to the question of whether the PXA processors that have been driving Pocket PCs and numerous industrial handhelds for a decade can run Android (we asked Marvell several times, to no avail).

The folks at ADLINK in Taiwan have been offering their TIOT handheld computer in two versions since late 2011; the TIOT 2000 runs Android, the identical-looking TIOT 9000 Windows CE. Here, though, the Android model runs on a Qualcomm processor whereas the Windows CE model has a Marvell PXA310.

General Dynamics Itronix has been playing with Android for a couple of years now, demonstrating their Android-based GD300 wearable computer to military and other customers. Panasonic introduced their Toughpad to great fanfare at Dallas Cowboys Stadium in November of 2011, but though the rather impressive tablet seemed ready back then, it won't actually start shipping until the summer of 2012. Motorola Solutions also announced an Android tablet late in 2011, but I am not sure if the ET1 Enterprise Tablet is in customer hands yet.

Mobile computing industry veterans may recall that there was a similarly confusing era several technology lifetimes ago: back in the early 1990s, the upstart PenPoint OS came on so strong that several major hardware companies, including IBM, shipped their tablets with PenPoint instead of Microsoft's unconvincing pen computing overlay for Windows. Microsoft, of course, eventually won that battle, but its "win" also demoted tablets back into near irrelevance for another decade and a half. Will it be different this time around? No one knows. Microsoft dominates the desktop, as was the case back then. But unlike PenPoint, which despite its hype was known only to a few, Android is already familiar to hundreds of millions.

The next six months will be interesting.

Posted by conradb212 at 10:10 PM | Comments (0)

May 02, 2012

The widening gulf between consumer and vertical market handhelds

Almost everyone has a smartphone these days. Smartphones are selling by the tens of millions every quarter. In Q1 of 2012, Apple and Samsung sold over 30 million smartphones each. Smartphones have become part of modern life. Everyone is tapping, pinching and zooming. Everyone except those who need a rugged smartphone. Because there isn't one.

Now to be fair, there are rugged smartphones, and there are any number of ruggedized handhelds that add phone functionality to a handheld computer that can also scan and do all the things people who work in the field need to do on the job. Except they really aren't smartphones. Not in the way consumers have come to expect smartphones to be. Why is that?

Because ever since 2007 when Apple introduced the iPhone, there's been a widening gulf between consumer phones and the devices people use at work. Before the iPhone, cellphones had a bit of rudimentary web functionality and a number of basic apps. Nothing was standardized and everyone rolled their own. Professional handhelds almost all ran Windows Mobile, which had had very good phone functionality as early as 2002. But Windows Mobile never really took off in the consumer market.

Why did the iPhone change everything? Because it introduced a fluid, elegant way of using and interacting with the phone that resonated with people and made total sense. Almost no one wants to first pull out a plastic stylus to then operate a clumsy mini version of a desktop OS. But lightly tapping at a screen, dragging things around, and effortlessly zooming in on what's too small on a tiny phone display, that's an entirely different story. One that Google quickly copied with Android, and one that Microsoft did not, or not until it was too late.

As a result, smartphones took off on a massive scale, one much grander than anyone had anticipated. And it was the sheer, simple elegance and functionality of just having to lightly tap, swipe, pinch and zoom that did it. Which, in turn, came from Apple's primary stroke of genius: using capacitive multi-touch.

The rest is history. Since 2007, Apple has sold hundreds of millions of iPhones. And there are hundreds of millions of Android smartphones as well, with the vendors selling them combined holding a larger market share than Apple.

With all of this happening and perhaps half a billion handhelds being sold in just five short years, how did the vertical market respond? How did it benefit from the riches, the opportunities, the breakthrough in acceptance of handheld technology that the vertical market had been waiting for?

It didn't.

Ruggedized handhelds still run Windows Mobile in a form virtually unchanged from the days before Android and the iPhone. There is no multi-touch. There is no effortless tapping and panning and pinching and zooming. There is no app store (there was one, but Microsoft closed it).

And worse, there is no upgrade path. Windows Mobile, which Microsoft merged into its embedded systems group a while ago, seems frozen in time. But isn't there Windows Phone 7, now Phone 7.5, currently heavily promoted with the launch of the Nokia Lumia 900 smartphone? There is, but Windows Phone is totally different from Windows Mobile. There is no upgrade path. And even if there were, it's a market where there are already half a billion iPhones and Android smartphones, and people who know how to use them and who expect nothing less. Not in their personal lives, and not on the job.

That is a definite problem for those in the market of making and selling ruggedized handhelds. And the problem is not demand. With the world now pretty much convinced that handheld computing and communication devices are tremendously useful and will only become more so, no one needs to be sold on the merits of handheld technology on the job. Everyone knows that already.

The problem is that the business market now wants smartphones that are a little (or even a lot) tougher than a consumer phone, and perhaps can do a few things consumer phones don't do so well, like scanning. But the market wants that extra toughness and those extra capabilities without giving up the elegant, effortless user interface, the bright high-res displays, and the ability to take pictures and HD movies so good that consumer smartphones are now replacing dedicated digital cameras.

And that's why it is becoming increasingly difficult to sell handhelds whose technology and functionality are by now very dated by consumer smartphone standards. Sure, the technology and functionality of most ruggedized handhelds are as good as or better than they were six years ago, but the world has changed. Sure, the vaunted Microsoft leverage argument ("You use Microsoft in your business, so Windows Mobile fits right in and you can leverage your existing investment") still applies. But that is no longer enough. Businesses that need to equip their workers with rugged handhelds now want more.

But isn't the mere popularization of handheld technology enough for rugged technology vendors to make a good living? Perhaps. It all depends on the type of business and its inherent profitability. But is essentially standing still a good business strategy in a technology boom measured in hundreds of millions of consumer handhelds? And are the largely flat financials of rugged handheld makers not a warning sign?

There are many possible scenarios. For example, perhaps we're seeing a total separation of consumer and vertical markets, one where consumer handhelds get ever more powerful while much more rugged vertical market computers pursue a small niche where they simply won't ever be challenged by consumer technology. And perhaps Microsoft will manage to somehow leverage a successful unified Windows 8 Metro-style user interface into handhelds that can become the true successor of Windows Mobile, with whatever benefits customers see in remaining within the Microsoft fold. And perhaps there really is an insurmountable challenge in making capacitive multi-touch suitable for rugged applications (this is often voiced as a reason, though I can't quite see it).

But there are also darker scenarios that bode less well for the verticals. If consumer phones aren't tough enough or don't have certain peripherals, third parties may simply make rugged cases and enclosures to make them tough, and sleeves and caddies to add whatever functionality business customers want. Without losing the performance and capabilities of a consumer smartphone. In that case, what could and should have been a golden opportunity for vertical and industrial handheld makers might simply vanish as consumer technology eats their lunch.

As is, it's become somewhat painful to see vertical market companies struggle, companies that know so well how to make products that hold up under trying circumstances, products that don't leak, products with displays that can be read in bright sunlight, products that will last years rather than months, and products that are tailor-made so well for very specific needs. Those companies have a lot of valuable expertise and so much going for them.

But will all that be enough to mask and make up for an increasingly wider gulf between vertical market and consumer market technology? Only time can tell, and it may be running out.

Posted by conradb212 at 04:59 PM | Comments (0)

April 24, 2012

e-con Systems executive explains the reality of cameras in rugged computers

A little while ago I had an email conversation with the folks at e-con Systems. They are an embedded product development partner with significant expertise in camera solutions in the Windows CE and Windows Embedded space. The company offers a variety of lens and camera modules that can be interfaced with most of the common handheld processors from TI, Marvell, Freescale and others. My interest was, as I discussed in earlier RuggedPCReview.com blog entries, why at a time when every new smartphone includes a superb camera capable of full HD 720p or 1080p video, the cameras built into rugged devices lag so far behind.

Here is what Mr. Hari Shankkar, co-founder and VP of Business Development of e-con Systems, had to say:

"We have worked with several rugged handheld manufacturers and they use our device driver development services or our camera modules. Based on this experience and our interactions with them, here are our comments:

  • There is a big difference in the way rugged computers are constructed and the way devices such as digital cameras or smartphones are built.
  • The bulk of the NRE (non-recurring engineering) effort goes into making the device rugged, and only a very small percentage is left for the camera. In the case of a digital camera or a cell phone this is not so, as the cameras are given higher importance.
  • These devices are sold through tenders, and it is mostly B2B (business-to-business), not B2C (business-to-consumer) like cell phone cameras and digital cameras. The requested quantities are low, like a few hundred per month or per quarter. We have personally not seen these tender documents, but from what we have been told, the emphasis is given more to the ruggedness than to the camera side. The camera is needed, but customers are mostly concerned about the resolution of the pictures and whether they can capture 1D/2D barcodes with it.
  • Some of the cameras with ISPs (image signal processors, for backend digital processing) don't work at very low temperatures; only raw sensors work at such low temperatures. This means you have to have an external ISP on the board. But some of the manufacturers prefer to have the ISP implemented in software and not have any hardware ISP. Digital cameras and cell phone cameras have the ISP integrated externally for high resolutions. This is one of the reasons you don't see a rugged computer with an 8MP or a 14MP camera very often. Currently, the 8MP and 14MP parts are raw sensors, and no one has an ISP built in.
  • The image captured from a sensor can vary with the lens choice. A glass lens will give better quality than a plastic lens. However, we see most of the vendors going with camera modules that have plastic lenses, which of course affects the quality of the images you are capturing.
  • As long as end customer demand is not that great for cameras, this will remain the case. We do not see integration of global shutter cameras (required for capturing stills of fast-moving objects) or integration of glass lenses happening in the immediate future."

So what Mr. Shankkar is saying is that a) rugged manufacturers concentrate on the basic design to the extent that the camera is usually an afterthought (and our internal examination of most rugged designs confirms that), that b) there are some image signal processing issues that complicate matters for rugged applications, and that c) in the absence of higher customer demand, the quality of imaging subsystems in rugged designs is going to remain as is.

Those are certainly logical reasons, and as a provider of imaging solutions for handhelds and other devices, Mr. Shankkar is familiar with the thought process and priorities of rugged equipment vendors. And e-con Systems certainly has a roster of very competent camera modules (see e-con Systems camera modules).

Nonetheless, I cannot help but see a widening disconnect between rugged computer manufacturers and the digital imaging industries here. Integrating the imaging quality and functionality of, say, a US$200 GoPro Hero 1080p hybrid video camera into a high-end rugged data capture device simply ought to be doable. And if I can take superb high definition pictures and 1080p HD video with a 5-ounce iPhone 4s, the same ought to be doable in a rugged handheld or tablet. Yes, it would add cost, but these are not inexpensive devices, and the precision data capture requirements of many vertical market applications deserve no less than what any smartphone camera can do.

Posted by conradb212 at 10:50 PM | Comments (0)

April 18, 2012

The nature and potential of Windows 8 for ARM devices

Well, Microsoft announced in its Windows Blog (see here) that there will be three versions of the upcoming Windows 8. For PCs and tablets based on x86 processors, there will be plain Windows 8 and the more business-oriented Windows 8 Pro that adds features for encryption, virtualization, PC management and domain connectivity. Windows Media Center will be available as a "media pack" add-on to Windows 8 Pro. A third version, Windows RT, will be available pre-installed on ARM-based PCs and tablets. Windows RT will include touch-optimized desktop versions of Word, Excel, PowerPoint, and OneNote.

That, mercifully, cuts the number of available Windows versions down from Windows 7's five (Starter, Home Basic, Home Premium, Professional, and Ultimate) to just three, if you don't count additional embedded and compact varieties.

While Microsoft's April 16 announcement on the versions was interesting, what's even more interesting is a long entry in Microsoft's MSDN blog back on February 9. Called "Building Windows for the ARM processor architecture" (see here), it provided an almost 9,000-word, fairly technical discussion of the ARM version of Windows 8, and it shed some light on how Microsoft intends to implement and position the next version of Windows, and make sure Windows won't be irrelevant in what many now term the "post-PC" era.

As you may recall, Microsoft's initial Windows 8 announcements were a bit odd. Microsoft called Windows 8 "touch first" and made it sound as if Windows 8 were a totally multi-touch centric OS. While that certainly sounded good in a world awash in iPads, it seemed exceedingly unlikely that all those hundreds of millions of office workers would suddenly switch to touch devices. One could really only come to one conclusion: Windows 8 would most likely work pretty much like Windows 7 and Windows XP before it, but hopefully also somehow incorporate touch into the vast Microsoft software empire.

The MSDN blog goes a long way in explaining much of what we can expect. It's difficult to condense the very long post into some of the important basics, but it goes something like this:

Windows on ARM, originally abbreviated WOA and then renamed Windows RT in the April announcement, should feel as much like standard Windows 8 as possible. To that end, while the ARM version cannot run legacy Windows software, there will be a Windows desktop with the familiar look and feel, and also a lot of the familiar Windows desktop functionality.

Microsoft also emphasized that Windows RT will have a "very high degree of commonality and very significant shared code with Windows 8." So why can't it run legacy Windows software? Because, Microsoft says, "if we enabled the broad porting of existing code we would fail to deliver on our commitment to longer battery life, predictable performance, and especially a reliable experience over time."

That, however, doesn't mean there won't be Microsoft Office on the ARM version of Windows. In fact, every Windows ARM device will come with desktop versions of the new "Office 15," including Word, Excel, PowerPoint and OneNote. Will the ARM version of Office be different? Microsoft says the apps "have been significantly architected for both touch and minimized power/resource consumption, while also being fully-featured for consumers and providing complete document compatibility." What that means remains to be seen. After all, the Windows CE/Mobile "Pocket" versions of the Office apps were also called Word, Excel, PowerPoint and OneNote, while offering just a small fraction of the desktop versions' functionality.

From a cost point of view, x86 Microsoft Office runs from US$119 (Home and Student) to US$349 (Office Professional). Considering that Windows RT devices will likely have to be very price-competitive with iPads and Android tablets, including Office will put an additional cost burden on Windows ARM devices.

Now let's take a broader look at Windows RT and how it'll differ from standard x86 Windows 8. First of all, you won't be able to just buy the Windows RT OS; it only comes pre-installed on hardware. That's really no different from Android, and the reason is that the operating system on ARM-based devices is much more intertwined with, and optimized for, particular hardware than x86 Windows, which pretty much runs on any x86 device.

Microsoft also stated that it has been working with just three ARM hardware platform vendors: NVIDIA, Qualcomm and Texas Instruments. There are, of course, many more companies that make ARM-based chips, and it remains to be seen whether other ARM vendors will remain excluded or whether they, too, will have access to Windows RT. As is, while Windows has always been predominantly x86, Microsoft occasionally supported other processor platforms. Early Windows CE, for example, was a multi-architecture OS: back in 1997, Windows CE supported Hitachi's SuperH architecture, two MIPS variants, x86, the PowerPC, and also ARM.

Another difference between the x86 and the ARM version of Windows 8 is that "WOA PCs will be serviced only through Windows or Microsoft Update, and consumer apps will only come from the Windows Store." So while x86 Windows 8 application software will likely be available both through the Windows Store and directly from developers, Windows 8 ARM devices will follow the Apple app store model. That, of course, has significant control and security implications.

A further difference between Windows 8 x86 and ARM devices will be that while conventional x86 hardware likely continues to have the traditional standby and hibernation modes, ARM-based Windows devices will work more like smartphones and tablets that are essentially always on.

Now for the big question: How does Microsoft intend to bring Windows to such wildly different devices as a desktop PC and a tablet without falling into the same traps it fell into with earlier tablet efforts that were never more than compromises? In Microsoft's vision, by adding WinRT, a new Windows API that handles Metro-style apps. From what I can tell, if a Metro application (i.e. one that only exists in the tile-based Metro interface) completely adheres to the WinRT API, then it can run both on ARM devices and on x86 devices under their Metro interface.

What does that mean for existing software that developers also want to make available on ARM devices? There are two options. First, developers could build a new Metro-style front end that communicates with external data sources through a web services API. Second, they could reuse whatever runtime code they can within a Metro environment. Either way, the old Windows leverage argument ("staff and developers already know Windows, so we should stay with Windows") won't be as strong, since the WinRT API and the Metro interface are new. How that will affect business customers who simply wish to stay with Windows instead of using iPads or Android tablets is anyone's guess.
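
In the abstract, that first option looks something like this: the existing business logic moves behind a small web services API that any front end, Metro-style or classic, can call. Here's a minimal Python sketch of such a service, with hypothetical endpoint and data:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical back-end service: the existing business logic lives
    # here, and any front end (Metro-style, classic desktop, or mobile)
    # talks to it over plain HTTP/JSON instead of linking against it.
    class ApiHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/api/workorders":
                # A real system would query the existing database here.
                payload = json.dumps([{"id": 1, "site": "Pump station 7"}])
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(payload.encode("utf-8"))
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ApiHandler).serve_forever()

A Metro front end would then simply fetch /api/workorders, leaving the platform-specific code as a thin presentation layer.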

I must admit that having gone through Windows for Pen Computing (1992), the Windows Pen Services (1996), and then the Windows XP Tablet PC Edition (2002), I am a bit skeptical of Microsoft's approach to Windows RT. It still feels a lot like hedging bets, cobbling yet another veneer on top of standard Windows, and claiming integration where none exists.

In fairness, the iPad has the same issues with Mac OS. The iPad is fundamentally different from a desktop iMac or even MacBook, and I am witnessing Apple's attempts at bringing the Mac OS closer to iOS with a degree of trepidation. But the situation is different, too. Microsoft's home base is the desktop and it now wants (and needs) to find ways to extend its leadership into tablets and devices, whereas Apple found a new and wildly successful paradigm that flies on its own and only loosely interfaces with the desktop (where most iPad users have Windows machines).

Bottom line? For now, while Windows 8 will undoubtedly do very well for Microsoft on the desktop and on laptops, it remains far from a certain slam dunk on the tablet and devices side. As I am writing this, Microsoft, AT&T and Nokia are on an all-out campaign to boost Windows Phone with the Nokia Lumia 900, but considering the massive head start the iPhone and Android have, nice though it is, Windows Phone remains a long shot. Windows RT will likely encounter a similar situation.

One possible outcome may be that Windows RT will lead to a resurgence of the netbook syndrome. Netbooks sold on price alone, though they were never very good. Low-cost Metro devices might pick up where earlier-generation netbooks left off, with multi-touch and lots of post-PC features, while still nominally being Microsoft and having Office.

Posted by conradb212 at 05:06 PM | Comments (0)

April 16, 2012

Will GPS drown in commercialism?

There are few technologies that have changed our lives and work as fundamentally as GPS. Not so very long ago, if you needed to know where to go, you used a paper map. Today we simply punch in where we want to go, then listen to directions and monitor our position on the GPS display. And industry, of course, has taken wondrous advantage of GPS, using it to optimize and manage transportation and location-based services to a degree never thought possible. GPS, by any account, is totally crucial to our modern world and society.

That's why a couple of recent observations worry me.

The first was when I left for San Francisco International Airport for a recent trip to Europe and my Garmin GPS did not find San Francisco Airport. Flat out did not find it. Not even in the transportation category. What it did find, though, was a hotel close to the airport. And so, since I was already underway and needed to concentrate on traffic, that's what I had to choose as my destination. Which promptly meant that I missed an exit. I have to believe that a Garmin GPS ought to find San Francisco International Airport, but mine didn't. All it coughed up was a hotel nearby.

After I returned from Europe, I needed to take my son to a local high school for a college orientation. I looked up the location of the college on Google Maps on my iMac and committed it to memory. In the car, I used the Maps app on my iPad, which is by Google, and the iPad drew the route from my home to the school. Except that it wasn't to the school. It was to a "sponsored location" nearby. Yes, the official Maps app on the iPad guided me to a "sponsored location" and not to where I wanted to go. Without telling me. It did place a small pin where I actually wanted to go, but the route it drew was to the sponsor location.

That is a very dangerous trend. Project it into the future, and you might see a situation where GPS is as utterly unreliable and frustrating as email is today. Just as we drown in commercial spam, what if GPS apps likewise drown us in "sponsored locations," making users sift through commercial GPS spam in order to find what they really need? That would make GPS not only useless, but potentially dangerous.
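
Until the mapping vendors address this, applications could at least defend their users with a simple sanity check: compare the endpoint of the route actually returned against the coordinates of the destination the user requested, and warn when the two diverge. A small sketch, with illustrative coordinates and tolerance:

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two lat/lon points, in kilometers.
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Illustrative coordinates: the destination the user asked for vs.
    # the endpoint of the route the app actually drew.
    requested = (37.6213, -122.3790)
    routed = (37.6320, -122.4000)

    if haversine_km(*requested, *routed) > 0.5:  # tolerance: 500 meters
        print("Warning: route ends somewhere other than the requested destination")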

That, Google, would be evil indeed, and it's already evil that I am guided to a "sponsored location" instead of the clearly defined location I wanted to go to.

How does that relate to rugged computing? It's pretty obvious. What if commercial hooks begin hijacking routes? What if even official addresses are drowned in sponsored spam locations? Think about it.

Below: the routing to the sponsored location instead of the requested location, which is marked by a pin.


Posted by conradb212 at 03:56 PM | Comments (0)

March 08, 2012

The new iPad -- both challenge and opportunity for rugged market manufacturers

If you want to sell tablets it's tough not to be Apple. And on March 7, 2012, it got that much tougher. For that's when Apple introduced the next version of the iPad, setting the bar even higher for anyone else.

Why do I even mention that here at RuggedPCReview.com where we concentrate on computing equipment that's tough and rugged and can get the job done where a consumer product like the iPad can't? Because, like it or not, the iPad, like the iPhone, sets consumer expectations on how computing ought to be done. It does that both by the elegance and brilliance of its execution, and by the sheer numbers of iPads and iPhones out there (Apple has sold 315 million iOS devices through 2011). That pretty much means anything that doesn't at least come close to offering the ease-of-use and functionality of the Apple devices will be considered lacking, making for a more difficult sell.

Unfortunately for anyone else out there trying to sell tablets, it's been tough. Somehow, while the iPad is simply a tablet, a way of presenting, consuming and manipulating information, it's been remarkably difficult for anyone else to convince customers to select them and not Apple. Remarkable because Apple, despite its mystique, never managed to make even a dent in Microsoft's PC hegemony, and remarkable because of the number of vocal Apple opponents who shred whatever Apple creates seemingly on principle.

But let's take a quick look at Apple's latest version of the iPad, called not, as expected, iPad 3, but once again simply iPad.

No one ever complained about the resolution of the iPad display (1024 x 768), and everyone else stayed around that resolution as well, with lower-end products perhaps offering 800 x 480, many using the old 1024 x 600 "netbook" resolution, and higher-end products going as far as 1280 x 800 or the wider 1366 x 768. Well, with the new iPad Apple quadrupled the pixel count to 2048 x 1536 (3,145,728 pixels, exactly four times the 786,432 of 1024 x 768), making for a superior viewing experience. Such high resolution is not necessarily needed, but if it's available for as comparatively little as Apple charges for iPads, everything else now looks lacking. And I can definitely see how the super-high resolution could come in very handy for many vertical market applications.

The new iPad also has two cameras. The new iPads we ordered won't arrive for another week, so I don't know yet just how good they are, but if the iPhone 4s is any indication, they will be very significantly better than what anyone else in the rugged arena has to offer at this point. I've long wondered why expensive, high-quality rugged handhelds, tablets and notebooks come with marginally acceptable cameras, and the new iPads will only widen the chasm. The iPad's cameras aren't only capable of offering fully functional video conferencing on the large screen; they can also snap rather high-quality stills, and they can record 1080p full-motion HD video, with image stabilization. And the iPad has the software to go with it. Few could claim this wouldn't come in handy for professionals in the field.

Advances on the technology side include a faster dual-core Apple-branded ARM processor with quad-core graphics and 4G LTE wireless broadband. Unlike some rugged hardware we've seen over the years, iPads were never underpowered, and with the new chip they'll be snappier yet. And while 4G wireless isn't ubiquitous yet by any means, having it built in certainly doesn't hurt. And then there's battery life, where the iPad, even the new improved one, wrings about ten hours out of just 25 watt-hours; that works out to an average draw of only about 2.5 watts. And the whole thing still only weighs 1.4 pounds.

Now, of course, the iPad isn't rugged. It's durable and well built, and if you use it in one of its many available cases it won't get scratched or dented, but it's not rugged. Its projected capacitive multi-touch screen famously cannot be used with gloves, you can't use a pen when pinpoint accuracy is required, and it's not waterproof.

None of which stopped the iPad from scoring some remarkable design wins in areas and industries that once did not look beyond rugged equipment. The FAA granted American Airlines permission to use iPads to replace inflight manuals and such, and American is deploying 11,000 iPads. Others will follow.

What does that all mean for companies that make rugged tablets? That the market is there. In fact, I believe the surface has barely been scratched. But it has to be the right product. Apple showed the way with the iPad but, with all due respect to those who've tried so far, few followed with more than a timid effort. It's been mostly wait-and-see, and now Apple has set the bar higher still. That doesn't mean it's over for anyone else, but it's gotten tougher. The new iPad will boost acceptance of the tablet form factor and functionality to even higher levels, and that still means opportunity for everyone else.

I am convinced that there's a large and growing demand for a more rugged tablet, and that whoever comes out with a product that doesn't just approximate but match and exceed expectations will win big.


Posted by conradb212 at 04:34 PM | Comments (0)

January 26, 2012

A conversation on imaging in rugged handhelds

Recently I received an email from someone in the industry that concluded with the question: "Wouldn't a conversation on imaging in rugged handhelds be interesting to your readers?"

The answer, of course, is "definitely," and so I responded as follows:

"I recently wrote two articles on the general state of imaging in handheld/mobile systems, so you basically know where I stand. In essence, given the very rapid advance in HD still/video imaging thanks to a convergence of CMOS, tiny storage formats, and H.264 compression technology (Ambarella!), it's now possible to generate excellent high resolution stills as well as near perfect 1080p/30 and better video in very small packages, packages that are small enough to fit into handheld and mobile computers.

"Yet, while we see tiny $200 GoPros and such, and advanced still/video capability in virtually every smartphone, the imaging technology we find in almost all rugged computers, even high-end ones, is lacking. Though we review and examine numerous mobile computers every year, we have yet to find a single one that has hybrid imaging capabilities that come close to what is possible today, and most are, in fact, barely usable. It is inexplicable to me how a $4,000 ruggedized notebook computer or tablet does NOT include competent imaging subsystems. There is room, there is a need, and the costs are not prohibitive.

"What enables me to make those statements? First, I have been reviewing rugged mobile computing technology for almost 20 years. For the past ten or 15 years, imaging in mobile computers has barely advanced. Second, I co-founded Digital Camera Magazine in 1997 (as the first magazine anywhere to concentrate solely on digital cameras). I continue to follow digital imaging closely and we also do digital imaging reviews as time allows. Third, as an enthusiastic scuba diver (see my scubadiverinfo.com), I have done many underwater imaging product reviews, including a couple on the GoPros (see here). Fourth, in working with several embedded systems vendors, I know what's possible in terms of integration. What I do see is an almost total lack of communication between computer and imaging people.

"I was not familiar with your company, but I see that you are in part concentrating on camera modules. Which means that you are probably painfully aware of the situation. What must happen is much better integration of much better imaging capabilities into mobile computers. At a time where I can produce near Avatar-quality underwater 1080p 3D video with two GoPros, and where world events are routinely reported on smartphones, mobile computers are woefully out of touch with imaging. A professional who pays $4,000 for a rugged computer (or even just $1,200 for a rugged handheld) should expect no less in terms of imaging quality and ease-of-use than you can get in a cheap digital camera (i.e. sharp pictures, a decent interface, HD video, and speed). Instead, what we currently have in most mobile computers is simply not nearly good enough. You could never rely on it even for quick, reliable snapshots in the field, let alone quality imaging.

"Think about it: businesses spend a lot of money to equip their personnel with expensive mobile computing equipment. Much of that equipment is used for data capture, sight survey, recording, reporting, etc. It makes zero sense to me to have vast computing power, a great outdoor viewable display, great communication and data capture technology, .... and weak rudimentary imaging that is in no way suitable or sufficient.

Posted by conradb212 at 09:02 PM | Comments (0)

January 20, 2012

Who's Using Rugged Tablet PC Systems

Just as tablets have become indispensable to consumers, rugged tablets are becoming more integral to business.

At first, Rugged Tablet PC systems were used by the military, where they had to be able to withstand very hostile environmental conditions. But over time, they've found uses in a number of other industries, including:

  • Retail
  • Field Service
  • Manufacturing and Warehousing
  • Transportation and Logistics
  • Field Sales and Service
  • Food and Beverage Distribution
  • Military and Public Safety
  • Agriculture

Meanwhile, new studies show that tablets and other handheld devices are now outselling laptops 2-to-1. With such widespread adoption, many companies are likely to find that rugged tablets make business more efficient, seamless, and ultimately more cost-effective.

When Should I Start Using Rugged Tablet PCs?

Rugged tablets are ideal for companies that typically deploy laptops to their field workers, or those that issue PDAs and other handheld devices to their mobile workforce. Other companies find that rugged devices are especially useful for their warehouse operations.

But not everyone is immediately sold on rugged tablets. A lot of people still don't know if their features and benefits justify the up-front cost. Answers to a few key questions can help them decide:

  • Is the device exposed to water and shock?
  • Is it likely to be dropped?
  • Does the user travel often, or work off-site?
  • Will it have to work in extreme temperatures?
  • Does it need a long battery life?
  • What functions does it need to perform?
  • Will it be used during most of the workday?
If the answer to most or all of these questions is yes, then a rugged device is not only useful, but essential for doing business.

But how rugged is rugged enough? Tell us about your application.

Posted by mobiledemand at 02:57 PM | Comments (0)

November 21, 2011

Ruggedized Android devices -- status and outlook

As far as operating system platforms go, the rugged mobile computing industry is in a bit of a holding pattern these days. Thanks to the massive success of the iPhone and iPad, there is a big opportunity for more durable handhelds and tablets that can handle a drop and a bit of rain, yet are as handy and easy to use as an iPhone or an iPad-style media tablet. On the tablet side, a lot of enterprises like the iPad form factor and ease of use, but they need something a bit tougher and sturdier than an iPad or a similar consumer product. On the smartphone side, hundreds of millions use them now and expect the same elegance and functionality in the handhelds they use on the job. But again, those professional handhelds need to hold up to abuse and accidents better than a standard consumer smartphone.

So with dozens and perhaps hundreds of millions of Android smartphones sold, and tens of millions of iPads, why are the likes of Lowe's home improvement center equipping their employees with tens of thousands of iPhones instead of presumably more suitable ruggedized handhelds (see Bloomberg article)? And why do we see iPads being sold into enterprise deployments that used to be the exclusive province of rugged tablets? There isn't one easy answer.

On the tablet side, it almost looks like the enterprise wants iPads and nothing else. Which is a problem for anyone who isn't Apple, as iOS is proprietary and Android-based tablets simply haven't caught on yet. That may be due to the perception that Android is really a phone operating system, or to potential customers being befuddled by the various versions of the Android OS.

On the handheld side, where Android has successfully established itself as the primary alternative to the iPhone, it would seem easy to offer Android-based ruggedized smartphones and handhelds. But there, too, the majority of recent product introductions still use the by now ancient Windows Mobile, an OS that looked and felt old nearly a decade ago.

So what gives? A few things.

With tablets, the almost shocking lack of success of Android and other alternate-OS tablets has had a cold shower effect. If neither Motorola Mobility (Xoom) nor RIM (PlayBook) nor Hewlett-Packard (TouchPad, Slate 500) can do it, who can? And then there's Microsoft's promise to finally get it right on tablets with the upcoming Windows 8. That's far from certain, but in a generally conservative industry where almost everything is Microsoft, the usual Microsoft leverage/investment/integration arguments carry weight.

With handhelds and smartphones, it's harder to understand, because non-Microsoft platforms have traditionally been far more successful there, and in the era of apps, software leverage hardly matters anymore. Perhaps it's Microsoft heavy-handedly forcing Android vendors into paying royalties to them, and not Google. Perhaps it's some sort of fear of straying too far into uncharted waters. It's hard to say. Almost everyone I talk to in the industry admits, off the record, to keeping a very close eye on Android developments.

So, all that said, where do we stand with respect to Android-based products in the vertical/industrial markets where durability, ruggedness, return-on-investment and total-cost-of-ownership matter?

In tablets, there have been two recent introductions. One is the Motorola Solutions ET1, a ruggedized enterprise tablet with a small 7-inch display. It's based on a TI OMAP4 processor and runs Android 2.3.4, i.e. one of the "non-tablet" versions. The ET1 was said to be available in Q4 of 2011. RuggedPCReview reported on the device here. The other notable introduction is the Panasonic Toughpad, introduced in November of 2011 but not available until the spring of 2012. The Panasonic Toughpad is a Marvell-powered device with a 10.1-inch screen and runs Android 3.2. Both devices seem to be what a lot of enterprise customers have been waiting for: more durable versions of consumer media tablets, fortified for enterprise use with beefed-up security, service and durability, without sacrificing slenderness, low weight and ease of use.

On the handheld side, we've also come across some potentially interesting products. The first is the ADLINK TIOT2000 (see our report), a conventional resistive-touch handheld with a QVGA display. What's interesting here is that ADLINK offers a visually identical version, the TIOT9000 (see here), that runs Windows CE, with the Android version using a Qualcomm 7227T processor and the Windows CE version a Marvell PXA310. Winmate just introduced its E430T, an industrial PDA with a large 4.3-inch capacitive touch display. This machine uses a Texas Instruments DM3730 processor and is said to be able to run either Android 2.3 or Windows Mobile 6.5. I've also seen Android listed as an alternate OS on some of Advantech's embedded modules, including the TI OMAP 3530-based PCM-C3500 Series (see here).

On the surface, it would seem to be almost a no-brainer to cash in on the great public interest in tablets/smartphones and the opportunity a new-era OS such as Android provides. But nothing is ever as easy as it seems.

For example, there's a big difference between traditional rugged tablets, which usually have either very precise digitizer pens or a resistive touch screen (or often both), and iPad-class devices that use capacitive touch, which allows all that tapping and panning and pinching but generally doesn't work in the rain or under adverse conditions. The same issue exists on the handheld side, where the traditional Windows Mobile is clearly designed for use with a passive stylus and cannot easily take advantage of capacitive multi-touch. That has, however, not stopped Casio from introducing the IT-300, which has a capacitive multi-touch display yet runs Windows Embedded Handheld 6.5 (see our report).

So it's all a bit of a mystery. The transition to new operating platforms is never easy and often traumatic, and there are good arguments for being cautious. For example, in addition to leverage, one of the big arguments for Windows CE/Windows Mobile has always been the wealth of existing software. True, but in a world of tens of thousands of often very slick and sophisticated iOS and Android apps, it's hard to believe developers wouldn't quickly come up with the appropriate versions and apps.

With tablets, the situation must be quite frustrating for manufacturers of rugged mobile devices. They undoubtedly see a great opportunity to cash in on the tablet boom, but they are to a degree caught between needing to support the existing Windows XP/Windows 7 infrastructure and deciding what to move to next. Microsoft is cleverly dangling a (for them) no-lose carrot in the form of Windows 8's Metro interface, where ARM-based devices would only run Metro and have no access to "classic" Windows, whereas for x86-compatible devices, Metro would just be the front end. So there are three potential success strategies: Android, Metro-based ARM devices, and x86 tablets that run Metro and classic Windows. No one can support all three.

So for now, as far as rugged tablets and handhelds go, it's the best of times and it's the worst of times.

Posted by conradb212 at 04:42 PM | Comments (0)