
May 24, 2016

Household items: coding, standards, and "2x" pics

Back in the day when we published Pen Computing Magazine and Digital Camera Magazine and some other titles in print, we always prided ourselves on being in total control of our own destiny. We did virtually everything in-house — writing, editing, photography, layout, prepress, web, marketing and advertising — and most of us had mastered several of those disciplines. We didn't want to farm anything out or rely on any one expert.

We felt the same about software. We had our own webhosting, our own servers right in our building, and we programmed everything ourselves. That way, no one could force us to update or upgrade when we didn't want to, no one could quietly put more and more unrelated ads, pop-ups and clickbait onto our pages, and no one could suddenly go out of business on us. No one could control us or buy us out either, because we financed everything ourselves. One doesn't get rich that way, but it pretty much guarantees continuity and longevity.

That doesn't mean we didn't run into issues. The author of a terrific piece of software that we used and loved was of the paranoid sort and, even though we paid a hefty price for the system, insisted on compiling everything and locking the software to our particular hardware. So every time we upgraded our servers, even in a minor way, we had to go and beg the man for a new code. That became increasingly difficult, and eventually he refused altogether.

Fortunately, that was an isolated incident, which is a good thing, as we use dozens of tools and utilities that run on our own server and without which we couldn't do business. Many are orphaned or haven't been updated in many years. But they still work, and they do the job better than what replaced them.

RuggedPCReview.com is a vast site with thousands of pages. Yet we don't use a content management system or anything like it. We handcode everything. Sure, we have utilities and scripts and routines that make the job easier, but when a new page goes up, it hasn't been generated by rev. 17.22 of version 5.21 of some corporate software system. It's all coded by hand.

Don't get the idea, though, that we're hidebound and unwilling to go with the flow. We routinely evaluate whatever new tools and systems come along. A few years ago we analyzed HTML5 and recreated part of RuggedPCReview in pure HTML5. It was an interesting and stimulating exercise, and we adopted parts of HTML5, but didn't see a need to convert everything.

More recently we took a look at WordPress. Like Movable Type, which we still use (and run on our own server), WordPress started as just blog software. It has since morphed into a full-fledged content management and site generation system, one that's replacing more and more conventional websites. As we had done with HTML5, we analyzed WordPress and recreated RuggedPCReview in WordPress.

We rejected WordPress for a variety of reasons. First, we use tables everywhere, and WordPress is terrible with tables. Second, WordPress is based on modules that are pretty much black boxes. You don't know what they do and how (unless you want to dedicate your life to learning and deciphering WordPress in detail). We don't like that. Third, WordPress layout is terrible. Even the best templates look like pictures and text blocks have been randomly dropped on a vast ocean of white background. And fourth, and most egregiously, with WordPress sites you never know if an article or posting is current or three years old.

So thanks, but no thanks. Which means that when we need to implement a new feature on our site, we have to be creative. A couple of years ago one of our much appreciated sponsors was unhappy that sponsor logos were listed alphabetically, which meant that some sponsors were always on top and others at the bottom. A reasonable complaint. WordPress likely has some black box for that, or maybe not. Our solution was to find a script and modify it for our purposes. It's been working beautifully.
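
To give an idea, here's a minimal sketch of that kind of script in Python. The sponsor names, file names, and markup below are made up for illustration; the point is simply that a few lines of hand-rolled code shuffle the display order on every page build while the master list stays alphabetical.

    # Hypothetical sketch: randomize sponsor logo order at page-generation
    # time so no sponsor is permanently stuck at the top or the bottom.
    import random

    # the master list stays alphabetical for easy maintenance
    SPONSORS = [
        ("Advantech", "advantech.html", "advantech-logo.gif"),
        ("Getac", "getac.html", "getac-logo.gif"),
        ("MobileDemand", "mobiledemand.html", "mobiledemand-logo.gif"),
    ]

    def sponsor_block():
        order = SPONSORS[:]        # work on a copy; keep the master list intact
        random.shuffle(order)      # a fresh random order every time
        return "\n".join(
            '<a href="%s"><img src="%s" alt="%s"></a>' % (url, img, name)
            for name, url, img in order
        )

    print(sponsor_block())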

Technology advances at a rapid pace, of course, sometimes for the better, and sometimes you wonder what they were thinking, because what came before worked better. That's mostly the case with software; hardware advances are generally a good thing. But here are a couple of examples of how advances in hardware affect running a site like RuggedPCReview.

There was a time when the web lived on desktop and laptop monitors, and phones either didn't have anything like the web, or had some separate, abbreviated version of it, like the unfortunate and ill-fated WAP mobile web on older feature phones. But with smartphones getting ever larger displays and ever more powerful electronics, there really wasn't a need for two separate webs anymore. Standard web browsing works just fine on phones.

The problem is that even a 5.5-inch screen like the one on the iPhone 6 Plus is awfully small to take in a webpage. You can, of course, quickly zoom in and out thanks to the wonders of effortless capacitive multi-touch, but that, apparently, was a thorn in the side of interface developers. So we're seeing all those efforts to make sites "mobile-friendly." The currently prevailing school of thought is to have sites consist of blocks that arrange themselves automatically depending on the size and width of a display. So if you have three pictures next to one another in a standard desktop browser, on a smaller screen the three pictures will rearrange themselves and become stacked on top of one another. Same with text blocks and other site elements.
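
For those who have never seen it spelled out, here is roughly what that looks like in code. This is a minimal, hypothetical sketch (the class name and image files are made up), not any particular framework's way of doing it:

    <!-- Three pictures sit side by side when the window is wide enough;
         on a narrow phone screen they wrap and stack on top of one another. -->
    <style>
      .row { display: flex; flex-wrap: wrap; }
      .row img { width: 300px; margin: 4px; }
    </style>
    <div class="row">
      <img src="tablet-front.jpg" alt="front view">
      <img src="tablet-side.jpg" alt="side view">
      <img src="tablet-back.jpg" alt="back view">
    </div>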

That may seem like a brilliant solution to programmers, but it's a hideous aesthetic nightmare in the eyes of anyone who's ever done layout and crafted pages just so. The mere idea that this could be a solution seems preposterous. So we're staying away from that nonsense.

But there are other issues. One of them is resolution. There was a time when most desktop and notebook displays used the same resolution, with every few years bringing a new standard that would then slowly be adopted. 640 x 480 VGA was gradually replaced by 800 x 600 SVGA which, in turn, was slowly replaced by 1024 x 768 XGA. Handhelds were in their own category with proprietary screen resolutions (like Palm) or the 240 x 320 QVGA of Pocket PCs.

That first changed when "wide-format" displays became popular. Where once everything had been displayed in the same 4:3 aspect ratio as TV sets, various aspect ratios quickly became an additional variable. The tens of millions who bought early netbooks will remember how the 1024 x 600 format favored on netbooks awkwardly cut off the bottom of numerous apps that were all formatted for 1024 x 768. And so on.

Then something else weird happened. While desktop and notebook displays only very slowly adopted higher screen resolutions, resolution virtually exploded on smartphones and tablets. Apple introduced the concept of "retina" displays where, when looked at from the typical viewing distance for a class of device, individual pixels can no longer be detected by the naked eye. As a result, high resolution quickly became a strategic battleground among smartphones. That led to the interesting situation where many smartphones with small 5-inch screens had the same 1920 x 1080 resolution as 15-inch notebooks, 27-inch desktop monitors, and 65-inch HD TVs. And now there are smartphones with 4k displays. That's 3840 x 2160 pixels, the same as 80-inch ultra-HD TVs.

What that means is that ordinary websites must now display in at least reasonable shape and quality on a device spectrum that ranges from tiny displays with insanely high resolution all the way to much larger desktop and laptop displays with generally much lower resolution, and often still using the old legacy resolutions.

Which is especially bad for pictures. Why? Well, let's take a look at RuggedPCReview. Ever since we started the site in 2005, all pages have been 900 pixels wide. That's because we wanted to fit comfortably into the 1024-pixel width of an XGA display. What happens when you look at the 900-pixel site on a full HD screen with its 1920-pixel width? Well, boxes and text and such scale nicely, but pictures suddenly look much worse. Now go to a 3840-pixel-wide 4k screen, and the pictures are hardly viewable anymore.

So what does one do?

Turns out this is an issue that's been hotly debated for several years now, but a common solution hasn't been found. I did some research into it, and there are literally dozens of ways to make pictures look good on various sizes of displays with various resolutions. They use different technologies, different coding, and different standards, which means any of them may or may not work on any given rev of any given browser.

In general, how can the issue be handled? Well, you could have high-res pictures, have the browser download those, and then display them at lower resolution if need be. Or you could use one of the image formats where the picture starts off blurry and then sharpens; if all the sharpness isn't needed, the browser could simply stop the process. Or you could download various versions of the same picture and then display the one that makes the most sense for a given resolution. One thorny issue is that you don't want to download a high-res picture when all you need is a much lower-res version. That'd be bad for bandwidth and loading speed. You also don't want to first load the webpage and then have it sit there with none of the pictures loaded while the software decides which version of a picture it should load, or while it's grinding away to get the picture into the optimal resolution. It's a surprisingly difficult issue.
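
As an example of the "various versions" approach, the HTML picture element lets a page offer several files and have the browser download only the one whose media condition matches, so no bandwidth is wasted on the others. A minimal sketch with made-up file names and breakpoints:

    <picture>
      <!-- wide desktop displays get the big version -->
      <source media="(min-width: 1200px)" srcset="photo-large.jpg">
      <!-- mid-size screens get a medium version -->
      <source media="(min-width: 600px)" srcset="photo-medium.jpg">
      <!-- everything else, and older browsers, fall back to the small one -->
      <img src="photo-small.jpg" alt="Rugged tablet">
    </picture>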

After having read a good many articles on the issue, I was about to give up because all approaches seemed too complex to make sense for us to pursue.

But then I came across a solution that made sense. It's the "srcset" attribute that can be used with the standard HTML code for displaying an image. The way this goes is that you tell the browser to display a picture the way it always does. But the srcset attribute now also says that if the screen of the device the picture is to be viewed on has such and such resolution, or is so and so many pixels wide, then use the higher resolution version of the picture! That sounds a bit difficult, but it's made easier by the fact that modern browsers know whether they run on a "1x" screen (i.e. a good old-fashioned standard-resolution display), a "2x" screen (like, for example, the Retina 27-inch iMac), or even a "3x" display (like a smartphone with insane resolution). Which means the browser only has to download one image, and it'll be the one that looks best on that particular display. Yay!
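
In practice it looks something like this (the file names are made up). The browser reads the density descriptors and fetches just one of the listed files:

    <!-- a "1x" display downloads photo-1x.jpg, a retina-class "2x" display
         downloads photo-2x.jpg; older browsers that don't know srcset
         simply use the plain src and still work -->
    <img src="photo-1x.jpg"
         srcset="photo-1x.jpg 1x, photo-2x.jpg 2x"
         width="450" height="300" alt="Rugged tablet review photo">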

There's one problem, though. And it's one that can be frustratingly difficult to solve. It has to do with picture size. Anyone who is familiar with modern compact or DSLR cameras knows that there is a huge difference in the size of a "raw" image and one that's saved in JPEG format. And also that pictures can be saved at various quality levels in the JPEG format. For most web display situations, the art of the deal is to compress as much as you can while still having a decent looking picture.

How does that affect having various versions of the same picture for different types of displays? Well, a "2x" picture has twice as many pixels in each direction, and thus four times as many pixels overall. So if a standard picture takes 50KB of storage space, a "2x" picture will take roughly four times as much, 200KB. And a "4x" picture, with sixteen times the pixels, would weigh in at around 800KB. It's easy to see that this can result in serious bandwidth munching and sluggish page loading.

Turns out the human eye is very easily fooled, and so we can cut some corners, but they have to be the right corners. Trial and error revealed that with our RuggedPCReview site, saving a "2x" size JPEG in what Photoshop considers "low" quality, at its "10" level, takes roughly the same amount of storage as saving the same picture in the smaller "1x" size at "good" quality, which is Photoshop's "60" level. But doesn't the picture look terrible saved at such high compression? Amazingly, no. It looks great, and much sharper than the higher-quality low-res picture. That means that while we still must create two versions of each picture, loading a page with a high-res picture on a high-res display takes no longer than loading the low-res picture!
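
For anyone who wants to repeat the experiment on their own pictures, here is roughly what it looks like scripted in Python with the Pillow imaging library. Note that Photoshop's 0-100 quality scale doesn't map exactly onto Pillow's, so the quality numbers below are illustrative rather than a formula, and the file names are made up.

    # Compare file sizes: full-size JPEG at low quality vs.
    # half-size JPEG at higher quality.
    import os
    from PIL import Image

    src = Image.open("photo-master.jpg")   # the full "2x" size master

    # "1x" version: half the linear dimensions, saved at decent quality
    small = src.resize((src.width // 2, src.height // 2), Image.LANCZOS)
    small.save("photo-1x.jpg", quality=60)

    # "2x" version: full dimensions, but heavily compressed
    src.save("photo-2x.jpg", quality=15)

    for name in ("photo-1x.jpg", "photo-2x.jpg"):
        print(name, os.path.getsize(name) // 1024, "KB")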

That sounded too good to be true, but we tried it and it works. So from now on, whenever possible, new RuggedPCReview pages will include both low-res and high-res versions of their images.

Isn't technology great!?


Posted by conradb212 at 06:26 PM | Comments (0)

February 23, 2016

Cat S60 — More than the naked eye can see

They used to say, and likely still do, that a picture is worth a thousand words. That's certainly true, but it can also be quite misleading, as pictures often tell a story rather than the story. There can be quite a difference between those two. The media is very adept at using carefully chosen pictures that tell a story that may or may not be so, or present the story with a slant or an agenda. One could almost say that a picture can tell a story in a thousand different ways. And in the age of Photoshop (used generically here; any image program will do), that's more true than ever. And let's not even talk about videos.

There is, however, another aspect of pictures. Most only show what the human eye can see. Light is electromagnetic radiation, and only a small part of it is visible to the human eye as colors: the part with wavelengths of roughly 380 to 750 nanometers. Below that are ultraviolet light and then X-rays and other rays. Above it come first infrared, then microwaves and radio waves.

I've always been interested in the spectrum beyond what we can see.

My initial degree was in architecture, and that meant understanding the principles of heating and cooling as well as energy conservation. While we humans primarily feel temperature, temperature can also be seen. Technologies that make infrared wavelengths visible to the human eye can show us temperatures as colors.

As an enthusiastic scuba diver I learned the ways light behaves underwater. Colors are different underwater because waves travel differently, and some wavelengths are filtered out by water sooner than others. Lower energy waves are absorbed first, so red disappears when a diver reaches a depth of about 20 feet. Orange disappears next, at around 50 feet. Then yellow at about 100. Green stays longer and blue the longest, which is why things look bluer the deeper you go. But it's not always like that. For example, I found that sometimes I could see red at depths where it was not supposed to be visible. I wrote about that in Red at Depth.

The image below shows the same coral head at a depth of about 90 feet without artificial light on the left, and with the flash on the right. Without the flash, the red on the left ought not to be visible at all. And yet it is.

Then there's the interesting phenomenon of fluorescence. Fluorescence essentially describes the physical process of a substance absorbing light at the incoming wavelength, and re-emitting it at a different wavelength. Almost everyone has seen the ghostly effects of "black light" that makes some materials glow brightly while leaving others unaffected. Through scuba I found that there's a totally fascinating world of fluorescence under the sea. I described that in Night Dives Like you've Never Experienced Before.

The image below shows an anemone we photographed with a yellow filter and a NightSea fluorescent protein flashlight. In normal light you'd barely see the animal, but it is strongly fluorescent and lights up under certain incoming wavelengths.

Having founded Digital Camera Magazine in 1998 gave me an opportunity to witness the progression of digital imaging, as well as the opportunity for hands-on experience with a large number of different cameras and lenses. That knowledge and experience, not only with cameras but also with the underlying imaging technologies, led to an interest in emerging uses and applications. That included explorations in ultra-slow-motion imaging for science projects with my son, and examining the emerging action photography and videography market (it's safe to say that I helped GoPro understand how light behaves underwater; see The GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do).

Below you can see me testing different color filters in a Northern Florida spring with a specially built rig with two GoPro Hero 3 cameras.

In my work as publisher of RuggedPCReview and before that Editor-in-Chief of Pen Computing Magazine, I came to appreciate thermal modeling and design in microelectronics where proper cooling and removal of heat generated by the processor and related circuitry is among the most important aspects of any mobile computing design.

That's where infrared imaging comes into play. Human eyes and normal cameras cannot see infrared. Older generations will remember that infrared, commonly referred to as IR, was widely used for data communication between computers and devices, and it is still used in many remote controls. Older audiences will also remember the "Predator" movies where those merry human-hunting aliens saw the world in infrared. Infrared, or thermographic, cameras have been available for decades, but at a high price.

Recently, a company named FLIR Systems changed all that with a series of much lower priced thermographic cameras, as well as the FLIR One thermal imaging camera module that snaps onto an iPhone or Android device. Not having a review relationship with FLIR, I pre-ordered a FLIR One for my iPhone 6 Plus, and it arrived in late 2015.

The FLIR One is an absolutely revolutionary product in that it lowers the cost of very functional thermal imaging to around US$250. The way the FLIR One works is that it shoots both a thermal image and a conventional picture. Why the conventional picture? Because thermal imagery doesn't provide much physical detail, and it can be difficult for our eyes, unaccustomed to thermal data, to interpret the image. So what the FLIR One does is extract line-drawing type detail from the conventional image and then merge it with the thermal image. That makes it much easier to see exactly what the thermal data pertains to. The FLIR One does all of that automatically.
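
To make the idea concrete, here is a rough sketch of that kind of edge-overlay blending in Python with OpenCV. To be clear, this is not FLIR's actual algorithm (FLIR calls its technique MSX); it's a generic illustration, the file names, thresholds and blend weights are made up, and it assumes the two images are already aligned.

    # Generic illustration: overlay edge detail from a visible-light photo
    # onto a colorized thermal image so outlines become recognizable.
    import cv2

    thermal = cv2.imread("thermal.png", cv2.IMREAD_GRAYSCALE)  # thermal readings
    visible = cv2.imread("visible.jpg", cv2.IMREAD_GRAYSCALE)  # conventional photo

    # scale the visible photo down to the (much lower) thermal resolution
    visible = cv2.resize(visible, (thermal.shape[1], thermal.shape[0]))

    # map the thermal readings onto a blue-to-red style color scale
    colored = cv2.applyColorMap(thermal, cv2.COLORMAP_JET)

    # extract line-drawing detail from the conventional picture
    edges = cv2.Canny(visible, 100, 200)
    edges_bgr = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)

    # merge: thermal colors carry the data, visible edges the outlines
    merged = cv2.addWeighted(colored, 0.8, edges_bgr, 0.3, 0)
    cv2.imwrite("merged.png", merged)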

When you show an IR picture to an uninitiated person, they will almost always assume it's just a Photoshop filter applied to a regular picture. But that's definitely not so. The thermal camera records data readings and then displays those on a color scale. FLIR One users can select from various color schemes. Since most people associate blue with cold and red with hot, I usually use the blue-red scheme.

What can you use the FLIR One for? Well, the applications are almost endless. The architect in me began a thermal imaging review of my home, identifying the efficiency of insulation and the presence of leaks. The scuba diver in me donned full scuba gear and examined hot and cold spots as those can definitely be an issue on a deep cold-water dive. And the reviewer of advanced rugged mobile computing products in me, of course, instantly began examining the thermal properties of gear in our RuggedPCReview testing lab. It's fascinating to see the heat signature of a mobile computing device, how it changes as the device heats up, and get an overall idea of how well the designers and engineers handled the thermal aspects of their products.

Below are some examples of images taken with the FLIR One iPhone module. The image on the left shows a heater floating in an iced-over Koi pond. Dark blue is the ice on the surface of the pond, orange and yellow the warmer water kept ice-free by the heater. The image on the right shows the thermal design of a RuggON PX-501 tablet (see our full review). Yellow shows the heat-generating CPU and ancillary circuitry, as well as the copper heat conduit to the fan.

The two pictures below will be of interest to architects, homeowners and contractors. On the left, it's obvious which part of an attic room needs better insulation. On the right, you can literally see the cold air wafting in between two windows on an icy December day.

Is the FLIR One perfect? Not yet. While quadrupling thermal resolution over earlier low-cost efforts, it's still only 160 x 120, far, far less than the typical 8-16 megapixels recorded of the visible light spectrum. But you don't need nearly as much resolution to convey useful thermal imagery (640 x 480 is considered high-res), and so the current low res is not a big problem. And now that FLIR has gained a foothold with the FLIR One and similar offerings, we'll likely see higher resolutions very soon.

But my story doesn't end here. In fact, you could say everything above is just an introduction to the news I had wanted to write about, the Cat S60.

Normally, we consider ourselves pretty well informed about anything in the rugged computing industry, but the Cat S60, officially introduced February 18, caught us by surprise. What is the Cat S60? Cat? Yes, Cat as in Caterpillar! Caterpillar actually sells a line of rugged smartphones on catphones.com. They all look suitably rugged, they all sport the familiar Cat logo, and their design language is reminiscent of Caterpillar's earthmoving machinery.

What immediately came to mind were branded signature editions of rugged hardware, like the "Hummer" version of one of the rugged Itronix notebooks several years ago. So did Caterpillar actually make these smartphones itself? Not really. Turns out that the Caterpillar phones come courtesy of UK-based Bullitt Group, a privately held technology company that apparently works with a number of tech companies. From what I can tell, they either license famous brand names or work in joint ventures. As a result there are JCB phones (JCB is a British heavy equipment manufacturer) and Kodak phones, and Bullitt has even licensed the Marconi brand from Ericsson to launch a range of radios named after the very inventor of radio. Bullitt does about $100 million in sales a year, not tremendous, but very respectable.

What's really interesting, though, is that the Cat S60 is not just a rugged smartphone built to benefit from the Caterpillar image and name. No, it's actually what Bullitt calls "the world's first thermal imaging smartphone" and it has a built-in FLIR imaging camera. So you get thermal imaging built right into your rugged smartphone. The Snapdragon 617 octa-core powered Android Marshmallow phone itself has a bright 540 nits 4.7-inch procap display that can handle wetness and gloves. Ruggedness specs are quite impressive with a 6-foot drop, and what appears to be IP68 sealing. The Cat S60 is said to be waterproof down to 17 feet for an hour, and its 13mp documentation camera can supposedly be used underwater (see Catphones media release).

Wow. That is impressive indeed. Having a rugged smartphone with integrated thermal imaging capability opens up entirely new applications and functionality in any number of areas. How is this possible? With FLIR's tiny Lepton longwave infrared sensor that's smaller than a dime. For those interested in all the details, here's the full FLIR Lepton datasheet in PDF format. Resolution of the initial Lepton imager is limited to 80 x 60 pixels, the same as in FLIR's iPhone 5 camera module, but entirely adequate for thermal imaging on a small smartphone screen. How much does the Cat S60 cost? US$599. Which seems almost too good to be true.

This is all very exciting. I don't know what Caterpillar's reach is in phones, given that we had never even heard of their phone operation. Then again, just yesterday I had a meeting with my landscape architect, and the man had not only heard of, but was very interested in, the new Cat S60. I can absolutely see how offering rugged handhelds or tablets with an integrated FLIR Lepton camera could be a strategic advantage, especially if bundled with thermal imaging demos and apps. And having that kind of functionality in a product would not only be of great interest to many customers, but also definitely gold for marketing.

Posted by conradb212 at 03:28 PM | Comments (0)

February 15, 2016

Keeping an eye on the level of technology offered in consumer tech: Dell Venue 8

The consumer market is really, really tough. Sure, massive fortunes can be made off it thanks to its sheer size, and thus the potential of millions of units sold. But few products ever make it into that sales stratosphere, and the competition is brutal. Make one mistake, be it in technology, manufacturing, marketing or just about anywhere else, and the product tanks, expensively. Add to that the fickle taste of consumers, the unpredictability of trends, and a lightning-quick product cycle pace, and the true successes are few and far between. That leaves some very good and often excellent products behind, or at least underappreciated.

That all came to mind as I spent the weekend playing with my latest impulse buy, a 7000 Series Dell Venue 8 tablet. First available in early 2015, the Venue 8 is an 8.4-inch consumer tablet that's part of Dell's efforts to establish itself in mobile technology. That effort had seen the short-lived Dell Venue smartphones and then a re-introduction of the line in late 2013 as tablets. Venue tablets are available both in Android and Microsoft Windows versions, the latter as the Venue Pro.

So why did I get a Venue tablet? In part because I've always liked Dell. I liked the old Dell Axim handhelds of the Pocket PC era, I like the various rugged Dell tablets and notebooks we've tested here at RuggedPCReview.com, and I liked Dell's decision to take the company private so as not to be at the mercy of Wall Street analysts whose quarterly growth expectations must be met lest they "worry" or, worse, become "concerned."

In this instance, I received an email alerting me to a special deal on the Venue 8. I decided to check it out, as the trusted Google Nexus 7 I'd been using as my personal small Android tablet, and also as a point of reference whenever we test a rugged Android device, had outlived its usefulness. After the latest OS upgrade — which, to Google's credit, was always quickly available for the Nexus 7 — the device had become so sluggish as to be useless. Could I not just ask one of our sponsors for a long-term loaner? I could, but I always like to have something that's truly mine and not subject to a sudden unexpected recall for inventory purposes or such.

Add to that that the deal was very sweet: just US$199 for a 16GB Dell Venue 8 7840 running Android 5.1 (Lollipop). That's $200 off the regular US$399, and shipping included. Fast shipping, too, as the package arrived at my doorstep just a day and a half after I ordered the tablet.

Now, since I am writing this for the RuggedPCReview.com blog, let me make something clear right off the bat: the Venue 8 is NOT a rugged device. It's of the standard consumer/business variety that Dell has always specialized in. So why the write-up if it's not a rugged device? Because it's always good to see what consumer technology is up to and what consumers expect, and get, for their money. With the massive global reach of smartphones and tablets, what consumers expect from their personal gear has a direct impact on what they expect from rugged gear. So there.

If that's the case, and according to our experience it is, then every manufacturer of rugged mobile computing gear should get a Venue 8 and study it. Because consumers get an awful lot of very advanced technology with this tablet, even at its US$399 list price.

First, there is the 8.4-inch display with massive 2,560 x 1,600 pixel resolution. That's 359 pixels per inch and incredibly sharp. It's an OLED (organic light emitting diode) screen that's also vivid, with deep blacks and intense colors. The display has absolutely perfect viewing angles from any direction. Brightness ranges, depending on which review you want to believe, from 250 to 430 nits. I consider it very bright. The folks at AnandTech actually didn't like the display very much in their review of the Venue 8 (see here). But compared to most displays in rugged devices, it's very, very good.

For a processor, the Venue 8 has a quad-core Intel Atom Z3580 with a maximum burst frequency of 2.33GHz. The Z3580 is part of the "Moorefield" lineup of Atom chips and designed specifically for smartphones and tablets. It's different from Bay Trail chips in that it's not using Intel HD Graphics, but a PowerVR Rogue G6430. There's 2GB of RAM and 16GB of eMMC mass storage, plus up to 64GB via externally accessible micro SD card. There's Bluetooth 4.0 and fast 802.11ac WiFi via an Intel 7260 WiFi + BT 4.0 module. And the 21 watt-hour battery is supposed to last 9.5 hours.

The Venue 8 has not just two, but four cameras. Yes, four. There is the standard frontal conferencing cam (2mp), there is the 8mp rear-facing documentation camera, and then there are two supplementary 1mp cameras that work in conjunction with the 8mp camera to allow depth measurement, as well as adjusting focus on a picture after it's been taken. You don't see that anywhere else.


The whole thing is packaged into an anodized aluminum case that measures 8.5 x 4.9 inches and is less than a quarter of an inch thick. This tablet makes the sleek iPhone 6 look a bit stout. Weight is 10.75 ounces.

So how well does the Venue 8 work?

Very well. The Venue 8 has an exceptionally high-quality feel to it, in that Apple-esque way that makes it feel like the device is milled from a solid block of metal. And the OLED display is just gorgeous with its rich, vibrant colors and deep blacks. It's like looking at a particularly good plasma TV compared to a regular LCD TV. I/O is minimal. There's the tiny micro-USB jack for charging. There's the micro SD card caddy. A headphone jack. And then the small volume rocker and power switch.

What's a bit unusual is the asymmetrical layout, with 3/16-inch bezels on three sides and then a much heftier 1-3/16 inches on the fourth. That's good news for the speakers, which get much more real estate than they usually do in a small tablet, and they face forward for good sound. It makes for a largish area by which to hold the tablet, but both the front and the rear main cameras are also located there, which means it's easy to inadvertently cover them. Overall, I like the arrangement.

In terms of performance, the Venue 8 is quick. Few of the Android devices I've tested or worked with are as smooth as anything that comes out of Apple; a degree of stuttering here and there is common. There's very little of that on the Venue 8. It's generally quick and responsive, and a pleasure to use.

Literally billions are familiar with Android now, which means that whatever quirks Android has — and it still has its fair share of them — do not affect its popularity. The vast variety of Android hardware and the numerous versions of Android itself, however, still mean that some apps are either not available for a particular device or version, or they are not optimized for your particular device.

Some reviewers have complained about very small text and icons on the Venue 8 due to its very high resolution on a fairly small display. I did not find this to be an issue under Android Lollipop, and certainly much less of an issue than it is on most small Windows tablets.

The "depth" camera assembly with its main 8mp camera flanked by two 1mp complementary cameras has me baffled. The idea here is that the two subsidiary cameras allow to capture depth information that can then be used to do a variety of things to a picture, like focus on certain areas, applying filters to certain areas, or even measuring distances and areas. It can also be used to widen or narrow the depth of field for artistic purposes.

Unfortunately, this is only marginally documented, and it didn't work all that well for me. On top of that, the 8mp camera itself isn't nearly as good as most people have come to expect from their smartphone cameras. So that's a disappointment. It does, however, still work better than most cameras in rugged systems. I understand the need to differentiate a product from the competition, but in this instance I'd have preferred one really excellent documentation camera instead of the triple camera experiment.

Battery life is excellent, especially considering there's only 21 watt-hours, and the device is less than a quarter inch thick. Battery life is so good that, as in iPads, it simply ceases to be an issue.

So what does it all mean as far as rugged computing is concerned? Since it's not a rugged device, nothing directly. However, the Venue 8 demonstrates the level of technology and features that's available to consumers for very little money. And the impressively high build quality. The vibrant OLED display with its very high resolution. All for US$399 list, or the ridiculously low US$199 on special.

And what does it mean as far as manufacturers of rugged tablets are concerned? Simply that the consumer market is spoiling consumers with advanced technology that's hard to match in low-volume ruggedized gear with much longer product cycles. So I don't expect to find all this enticing high technology in rugged computing products for the job. But it definitely IS good to keep an eye on what consumers are getting for very little money. Because those consumers then want the same in their professional gear.

Posted by conradb212 at 07:55 PM | Comments (0)

January 19, 2016

Follow-up on iPad Pro and Apple Pencil

I've now had the iPad Pro for a good couple of months and the Apple Pencil for a month and a half. How do I use them? Have they changed my life?

As far as the iPad Pro goes, it has totally replaced my iPad Air 2. I don't think I've used the Air 2 once since I got the Pro. However, I am doing the exact same things on the Pro that I used to do on the smaller Air 2. The split screen functionality is not good or compelling enough to really work with two apps at once, and it's nowhere near universally supported.

So I use the Pro just as a larger iPad. Although the Pro is significantly heavier than the Air 2, and almost a bit unwieldy, the bigger screen and the fact that it's a good bit quicker are apparently enough to keep me on the larger Pro.

I'm disappointed that there really are no apps that are truly "pro" in the sense that they add undeniable value to a larger device, making it a professional tool instead of just the device the iPad has always been. For now, there really is no difference.

How about the Apple Pencil? As someone who has worked with and written about pen computing technology for over 20 years, I should be a primary candidate for the Apple Pencil. I should be thrilled that Apple is finally recognizing the pen as the important productivity tool it can be.

But I am not.

I played around a bit with the Apple Pencil when I first got it, but haven't used it since. That's not because I am no longer a fan of pens. It's because Apple just didn't get it right. The Apple Pencil is too large, too slippery, and too poorly supported. You never know if an app will really support it, or just part of what the Pencil can do.

And having a pen with a battery is just unappealing, especially when its primary charging mechanism, sticking the pen into the Pro's Lightning connector, is just too bizarre. As is, when I look at the Pencil and feel like I want to try it again, the first thing that comes to mind is that it probably needs charging first. And I move on.

Add to that the fact that there's no garage for the pen on the big Pro, and the $99 Pencil seems almost like an effort by Apple to emphasize Steve Jobs' point: we don't need a pen!

All this baffles me. I really wanted to like the Pencil. But the way Apple went about it is like the way Microsoft went about the Kinect: an expensive add-on that shows flashes of brilliance, but overall just doesn't work well enough for people to want it.

Posted by conradb212 at 10:04 PM | Comments (0)

December 06, 2015

An assessment of the Apple Pencil

A few weeks after the Apple iPad Pro began shipping, the Apple Pencil is now available as well. This is big news because it was Apple who finally made the tablet form factor a success, and they did it without a pen. Which is remarkable, as tablet computers were initially conceived specifically as a modern equivalent of the standard notepad that you wrote on with a pen. And remarkable again, as Steve Jobs was adamantly opposed to pens and often listed his reasons why he felt that way.

But now the Apple Pencil is here, a good 5-1/2 years after the iPad was first introduced. Why did Apple do it? Perhaps it's because Samsung has been selling tens of millions of their Note tablets with pens. Perhaps it's because being able to do quick illustrations or annotate text documents with handwritten notes simply makes too much sense to ignore. Perhaps it's a tacit acknowledgment that fingerpainting is limiting when it comes to anything but the simplest of artistic expression. Perhaps it's because the big iPad Pro simply needed something to differentiate itself from smaller and lesser iPads. Or perhaps it's all of the above, or something else entirely. Fact is, the Apple Pencil is here.

The big question now is how, and how well, the Apple Pencil works, and what it might mean for Apple. After all, the pen is what Jobs felt was so unnecessary.

A brief history of using pens with tablets

But before I go into my own opinions and experiences with the Apple Pencil, I want to outline the big picture. As stated above, tablets were initially conceived as a modern-day replacement for pen and paper. And they've been around for over a quarter of a century. Contrary to what a lot of people believe, Microsoft did not invent the tablet in the early 2000s. Already back in 1991, the Momenta tablet was available, and it sparked great excitement over a future where tablet computers you could write on with a pen would replace the conventional PC.

In the early 1990s, every major computer company offered a tablet. At a company named GRiD, Jeff Hawkins (who would later invent the Palm Pilot) designed the GRiDPad and the PalmPad. NCR had the NotePad, Samsung the PenMaster (see below), Toshiba the DynaPad, IBM the ThinkPad (1993 IBM ThinkPad 730t shown on the right), and there were many more, all tablets operated with a pen. Many of those tablets initially ran the novel PenPoint operating system, specifically designed for tablets and use with a pen.

Unfortunately, while there was tremendous hype around those early 1990s tablets, they failed to become a commercial success for a variety of reasons. One was that the technology just wasn't there yet to create a tablet light and functional enough to be of much use. More importantly, supporters of the tablet concept back then took the idea of electronic pen and notepad too literally. Their central thought was that everyone knows how to use a pencil, and everyone knows how to write. So we simply use computing power to translate handwriting into text. The tapping, panning, pinching and zooming of modern tablets simply never entered the equation because a) it wasn't possible back then, and b) people didn't do that on pen and paper pads, so why do it on a computer?

As a result, early 90s tablets completely relied on pens. If the pen failed to work or was lost, there was no touch functionality as a backup. The tablet became useless. Likewise, if the computer failed to understand the handwriting or gestures written on it, which was often the case, it was useless.

That quickly dampened the enthusiasm for tablets, and the fact that Microsoft fought against the new pen-centric operating systems tooth and nail didn't help either. It was a war Microsoft won. Its token-effort "Windows for Pen Computing" prevailed against the far more innovative PenPoint OS.


That did not mean Microsoft had no interest in pens. After the Apple Newton failed in the mid-90s due to its exclusive reliance on handwriting recognition, Microsoft's own Windows CE achieved modest success on small handhelds that primarily used the pen as a mouse replacement. Microsoft then followed up with its 2001/2002 Tablet PC initiative (NEC LitePad shown on the right) that used active pens with significant built-in support from the Windows XP Tablet PC Edition. Handwriting recognition and gestures were available, but hardly played a role at all.

The Microsoft-spec Tablet PC failed because, again, it used the pen simply as a comparatively clumsy mouse replacement on Windows, an OS completely designed for use with a mouse. Plus, it was all too easy to lose the (still expensive) pen, and there was no intuitive finger tapping, panning, pinching and zooming as a backup and complement for the pen. Small wonder that Microsoft itself switched its emphasis to convertible notebooks instead of pure tablets before the "Tablet PC" was even officially launched.

Apologies for the long history here, but it's necessary to understand all this when assessing the Apple Pencil. Has Apple learned from history, or will the Pencil fail because it's making the same mistakes all over?

Has Apple learned from pen history?

So given all that, given that Apple showed the remarkably prescient "Knowledge Navigator" almost 30 years ago, given that Apple had the Newton over 20 years ago, and given that Apple has all the engineering and human interface design expertise in the world, how did Apple implement the new Apple Pencil for the iPad Pro? And what kind of technology did they use to make it work?

The first thing you notice about the Apple Pencil is that it's very long. A full seven inches. Most writing implements are about five inches long although, in fairness, an actual lead pencil is also about seven inches long. Still, it's not clear to me why Apple made the Pencil that long.

Most likely the space is needed for the electronics inside. Apple is very good at miniaturizing everything, but the Apple Pencil is a remarkably complex piece of equipment. The folks at iFixit.com tore one down (see here) and found not only a fairly long rechargeable (but non-replaceable) 0.33 watt-hour Li-Ion battery, but also a full logic board with an ARM Cortex-M3 processor, a Bluetooth chip, a variety of sensors and more. There's an awful lot of tech in there, and so for now Apple perhaps simply needed that much space.

Yes, there's a battery. Which means the Apple Pencil must be charged. This is in complete contrast to the active pen technology that has ruled supreme since the early 1990s: Wacom. Slender, lightweight Wacom active pens have been around since the very first pen tablets, and they've beaten all competing technologies over the past 20+ years. Microsoft pretty much built its 2002 Tablet PC around the Wacom pen. The image below shows the Apple Pencil and the Wacom pen used in a 2003 Toshiba convertible tablet.

Wacom's success came, for the most part, because the Wacom pen does not need a battery. Every other active pen technology does; Microsoft's N-Trig pens used with its Surface tablets need a battery. Since the battery in the Apple Pencil is non-replaceable, how is it charged? Amazingly and almost unbelievably, this way:

Can anyone say "recipe for disaster?" This has got to be one of Apple's worst ideas ever. It's in the "What were they THINKING?" category. The combination of a fragile Lightning connector and a seven inch lever sticking out from it is virtually guaranteed to eventually result in damage.

Fortunately, the Apple Pencil comes with a tiny little (and thus eminently losable) adapter so you can also charge it via a standard Lightning-to-USB cable.

Now how about two of the other big problems with pens in the past, cost and losing the pen? The news is not good here. At US$99, the Apple Pencil costs more than a low-cost tablet, and it's certainly not a throw-away item. It's also slippery, doesn't have a clip to attach it to a coat pocket (for which it'd be too long anyway), and thanks to the super-slender design of the iPad Pro, there's no garage in the tablet, or any other way to attach the pen to the tablet. Apple should have come up with some sort of solution for that.

Apple Pencil technology

Now what about the technology? Wacom technology works with a powered digitizer grid behind the LCD that can sense the proximity and location of the pen. Resistive pens need actual touch. Capacitive pens in essence emulate a fingertip, which changes the capacitance between two electrodes. Experts I have talked to have long thought that a combination of Wacom (or another active pen technology) and capacitive touch would be the ultimate answer to adding fine-detail movement to capacitive multi-touch. But some have recently changed their opinion and now see advanced capacitive pens as the likely winner. An example of the latter technology is the superb capacitive pens that come with some Panasonic tablets.

Apple, however, apparently took a different approach. The Apple Pencil communicates with the tablet via Bluetooth, which means Bluetooth must be on, with all that that entails. How exactly the Pencil works I don't know yet. I've seen sources that say the Pencil's technology is such that the iPad knows whether a touch comes from the Pencil or from a finger, therefore making it possible to process Pencil touch in very different ways. Add to that the sensors present in the Pencil, and things like the much-heralded "shading" become possible when the Pencil is held at an angle.

Clearly, wherever there is much tech involved, amazing things can be done, and the Apple Pencil is headed that way. But amazing things generally need a lot of support, and that support will take a while to materialize. As is, the Apple Pencil can be used to tap and pan anywhere, and it works in virtually all applications that accept inking. But the sexy shading requires special support, and for now it's a trial and error process figuring out which apps and tools support what.

If you search for "Apple Pencil Support" in the app store, nothing comes up. If you look for "Apple Pencil apps," a large number show up, but it's not clear how and to what degree the Pencil is supported.

How well does the Apple Pencil work?

How well does the Apple Pencil work in general?

Quite well. Playing with various drawing and sketching apps and exploring the tools with the Apple Pencil is a pleasure. Ink goes on smoothly and elegantly, and it's easy to see what the Pencil can do in the hands of an artist who invests the time to learn all aspects of all the tools. We're certain to see some amazing artwork created with the Apple Pencil.

However, pretty much anything I managed to do with the Apple Pencil I've been able to do for many years with a 20-year-old Wacom pen. Using a Wacom pen on a Microsoft-style Tablet PC delivers wonderfully smooth ink and virtually no lag, same as the Apple Pencil. Those older pens, too, have erasers and pressure sensitivity, and they can do amazing tricks. So for now I cannot exactly view the Apple Pencil as a stunning leap forward.

The Apple Pencil hasn't resolved some old pen technology issues. Even Apple claims "virtually" no lag on its own promos, as there still is a bit of lag. The human mind has a very firm expectation of real-time versus lagging response (think of how weird even a slight lip-sync delay is in video and movies), and with rapid movement of the Pencil, there's definitely a sense of delay.

Worse, for me, is the on-off delay in very small movements, like when an artist wants to add minute tiny touches here and there. In the apps I tried that with, there often was a slight delay before the touch registered. What that means is that with a real pencil you absolutely always know what to expect. For now, with the Apple Pencil, not so much.

Finally -- and that's another area where the pen-and-paper metaphor breaks -- when we write on paper with a pencil, we expect the relatively coarse feel of paper. When we pan and swipe on a modern tablet we want as little "stiction" as possible for best operation. Alas -- you can't have both. As is, between the thickness of the glass on the tablet and its super-smooth feel, working with the Pencil feels like writing on a piece of glass. The brain doesn't like that. The brain expects the friction. Likewise, when our eyes look at the Pencil operate from an angle, the brain says, "Wait a minute! I can feel the Pencil touch the surface, but the writing is a fraction of an inch away from the tip!" That's parallax, of course.

Handwriting recognition

It's hard to imagine that there was a time when there was fierce competition in the handwriting recognition software market, with various approaches and technologies seeking to address all sorts of writing needs. At one point I had nearly a dozen different recognizers working on a Compaq Concerto tablet convertible that served as my main mobile system for a while in the mid-90s. Today, recognition is essentially a non-factor, what with so many more people keyboard-savvy, and onscreen keyboards working so well.

But handwriting recognition software is still available. In fact, the roots of my favorite app, PhatWare's WritePad, go way, way back to the beginnings of handwriting recognition, and the technology has been refined ever since. It works beautifully on the iPad Pro.

Whether or not handwriting recognition will make a comeback is anyone's guess. Between the maturity of the software and the vastly improved capabilities of hardware it certainly works very well. One thing I found distracting is the clicking noise that the hard pen tip on the glass surface of the iPad Pro makes when handwriting. That was never an issue with the soft display surface of the Apple Newton, but such are the pros and cons of various technologies.

Apple Pencil: Too early to tell

Having worked with pens and tablets for 25 years, I want to like the Apple Pencil. I do believe a pen can greatly add to the iPad experience. But for now, and that is so highly unlike anything from Apple, what comes to mind is "the good, the bad, and the ugly." I love how well the Pencil works and the promise it shows. I am not fond of the price, the length, the ease of losing it, the battery, the ungainly looking tip and the uneven app support. And I am baffled that Apple thinks sticking the Pencil into an iPad port to charge it is even remotely a good idea.

So for me, the jury on the Apple Pencil is still out. But I am certainly glad Apple made it available. -- Conrad H. Blickenstorfer, December 2015

Posted by conradb212 at 06:36 PM | Comments (0)

November 20, 2015

Will the Apple iPad Pro herald an era of "pro" use of tablets?

My iPad Pro came in and I want to share my first impressions, and also my thoughts on where all this is leading.

Anyone who first sees the iPad Pro will immediately notice its size. The iPad Pro is big. 12 x 8.7 inches versus 9.4 x 6.6 inches for the iPad Air 2. So the iPad Pro's footprint is 68% larger than that of the standard iPad. Amazingly though, at 0.27 inches the iPad Pro is barely thicker than the iPad Air 2, which is 0.24 inches thick. And even more amazingly, the big iPad Pro is actually thinner than the sleek and slender iPhone 6S Plus (0.29 inches). In terms of weight, the iPad Pro comes in at a very manageable 1.57 pounds, just barely more than the original iPad, which weighed a pound and a half.


That said, its sheer size remains the iPad Pro's most obvious feature. The picture below shows the (already rather large) 5.5-inch iPhone 6 Plus, the 9.7-inch iPad Air 2, and then the new iPad Pro with a diagonal screen size of 12.9 inches.

What about some of the other display specs? The 5.5-inch iPhone 6S Plus has 1,920 x 1,080 pixel resolution, which means 401 pixels per inch (ppi). The 9.7-inch iPad Air 2 weighs in at 2,048 x 1,536 pixels, for 264 ppi. And the new 12.9-inch iPad Pro sports 2,732 x 2,048 pixels for the same 264 ppi. Super-sharp, all of them. And how bright are the displays? The iPhone 6S Plus is around 500 nits, the two iPads both around 450 nits. Plenty enough even for occasional outdoor use.

Setting up the iPad Pro was as simple as it gets. I backed up my iPad Air 2 to my iMac, then went through the setup process for the iPad Pro and selected "restore from backup." That took less than 20 minutes. Then the iPad Pro came up and looked just like my iPad Air 2. But not completely. Only about two-thirds of my apps were loaded; the rest I had to restore from the App Store myself. And whatever required a password on the existing iPad required the password again on the iPad Pro. Otherwise, everything was there. Kindle opened the book I was reading to the same page, and every webpage I'd had open on the iPad Air was also open on the iPad Pro when I first launched Safari. This ease of upgrading is something I have always loved about Apple.

I had feared that some apps might look terrible on the larger screen, the way iPhone apps looked terrible on the original iPad, which had not only a much larger screen, but also much higher resolution. Since the pixel density of the iPad Air 2 and iPad Pro is the same, nothing looks unusual, but many apps do look sort of too spread out on the much larger screen.

You do get used to the larger display size very, very quickly. After a couple of hours of checking out apps and just doing my regular iPad routine, the Pro, while still feeling unquestionably large, felt right, whereas the iPad Air suddenly seemed to have shrunk.

But does one really need such a large screen? As is and for now, it seems like a luxury. The vast wealth of existing apps have all been designed and optimized for the 9.7-inch iPad display, so it's not as if using the smaller screen was an inconvenience. Not in the way running classic Windows, designed for a big desktop display, can be a challenge on a small mobile device screen.

Where the larger screen will certainly come in handy is in split screen operation. Splitting the screen on the 9.7-inch iPad made for a lot of eye squinting and panning around. On the iPad Pro, working with two apps feels almost like having two iPads side-by-side. The picture below shows the iPad Pro in landscape mode with Excel and Safari side-by-side. Each half is plenty large enough for comfortable viewing.

Problem is, there aren't many apps that support the full split screen yet. By full I mean 50-50, instead of just the 2/3-1/3 split (sort of like what Microsoft introduced with Windows 8) that many apps are limited to. And sifting through an alphabetically ordered column of apps to pick the one you want in a split part of the screen is hardly the best way to get things done.

Another issue is that, apart from the bigger size, there isn't really anything "pro" about the iPad Pro yet. I searched for "iPad Pro" in the App Store, but couldn't find anything that truly seemed to be made for the iPad Pro, or take advantage of it. Not yet anyway.

One truly weird thing is that splitting a screen into two is sold and lauded as a marvelous technological advance. What?? For the past 30 years we've had as many resizable windows to work with as we wanted and needed, and now splitting the screen into two is news? No way.

There is, of course, the pen. Sadly, I'll have to wait for mine another three weeks or so, although I ordered it together with my iPad Pro. I did get a chance to play with it in the local Apple store. There's good news and not so good news.

The not-so-good news is much the same as what sank the first generation of pen computers in the early 90s and limited adoption of the second generation in the early 00s. The Apple pen (Apple calls it "Pencil") is very expensive (US$99), very large (bigger than a full-size pencil and with an oddly fat tip), and very easy to lose (it's slippery, has no clip, and there's no garage for it on the iPad). Worse, it needs a battery, which here means recharging it via the iPad's Lightning port, where it looks like it could very easily get damaged. And anything that must be charged will inevitably come up dead when you need it most.

All of the above, and a few other reasons, is why Steve Jobs was adamantly opposed to pens.

The good news, though, is that the pen works very well. Though the one I tried had somehow disconnected itself from its iPad and needed genius intervention, once the pen and tablet talked, the pen worked as smoothly and effortlessly as can be. Pressure results in thicker or thinner lines (hence, probably, "Pencil"), the pen never falls behind, and the much advertised shading when you hold the pen at an angle is marvelous indeed. If all of this sounds like the Apple Pencil is mostly for artists, that may or may not be so. I still love to jot quick notes on whatever scrap of paper is around, and with the right app, scribbling away on the iPad Pro could be a boon. The question then is whether that's enough of a use to justify a thousand-dollar tablet and a hundred-dollar pen.

Few remember today that handwriting recognition was the central concept of the first generation of pen computers. The idea was that everyone knows how to write, whereas not that many could type very well. So let's replace that clunky old keyboard where you have to learn the layout of the keys with a simple pen. The computer then recognizes the writing, and no more need for a keyboard, not even onscreen. Sadly, that never really worked out. It worked for a few (including me), but no computer was fast and smart enough to make sense of the kind of careless scribbling most of us commit to paper. And editing via pen was a pain, too, almost as much as editing via voice (which is why pure voice recognition also isn't winning any popularity contests).

But might useful handwriting recognition be a reason to have the Apple Pencil? That's quite possible. Apple owns excellent intellectual property in the form of the "Rosetta" recognition technology of the late Newton MessagePad that became available as "Inkwell" in the Mac OS. Whether or not this will amount to anything on the iPad with its quick and easy tap-editing is anyone's guess.

Final question: what impact will all of this have on rugged tablets? After all, Apple will likely never make a rugged iPad, and although many rugged cases are available for iPads, more than likely Windows and Android will remain the prevalent OS platforms in rugged computing devices.

The answer, I think, is that anything relating to screen size is worth watching. Trends towards smaller and larger display sizes in certain classes of mobile devices have always been quests for strategic advantage as much or more than technological progress (try to sell a phone with a 3-inch screen today!). And with both Microsoft and Apple (let alone Samsung) now using pens, pens may well regain a much more prominent position in mobile devices.

In closing this article, let's not forget that we're still very much in the midst of a software interface quagmire. Most of today's productivity software was created for use with a mouse on a desktop. Yet, it's a mobile world now where touch has replaced the mouse. Unfortunately, while panning, tapping, pinching and zooming work great in media consumption apps, they lack the precision mouse-era productivity software still requires. And that, my friends, is where the word "pro" ought to come into play. It's not so much the size of the screen as it is how to work on it. And that issue remains to be resolved.

Posted by conradb212 at 04:52 PM | Comments (0)

September 18, 2015

What led to the Universal Stylus Initiative

A short while ago I received a press release from the Universal Stylus Initiative. I filed that away in my mind, but got back to it because the concept certainly sounds interesting. Having used, described, tested and compared numerous pen and touch technologies over the past two decades in my work first at Pen Computing Magazine and then at RuggedPCReview, I definitely consider it a relevant and increasingly timely topic (witness Apple's announcement of the iPad Pro with a pen!).

So I spent some time thinking things through and figuring out the need for a universal stylus initiative.

The great appeal of tablets and smartphones, of course, is that they provide tremendous communication and computing capability in small and handy packages that can be taken anywhere. That's in part possible because they don't have physical keyboards that add weight and get in the way. The trade-off, of course, is that without a physical keyboard and a mouse it isn't as easy to enter a lot of data or easily operate the computer.

That's where touch and pens come in. Early tablets (yes, there were tablets in the early 1990s) were called pen computers because that's how they were operated, with active pens. There was touch, too, but it was primarily of the resistive variety that worked best with a pointy stylus. That technology saw its heyday when phones were still dumb and people kept their address books and calendars first on Apple Newtons and then on Palms and Pocket PCs.

When Microsoft became interested again in tablets around 2001/2002 (I say "again" because they'd been interested a decade earlier, but primarily to fend off pen-based rivals to Windows) they built the "Tablet PC" around active pen technology. It's called "active" technology because a sensor board behind the LCD detects the precise position of the tip of the pen even when the pen does not actually touch the glass. That's different from "passive" touch technology where a touch is only registered when a finger or stylus touches or depresses the LCD surface.

What are the inherent advantages and disadvantages of active versus passive?

First, active pens make "hovering" possible: a cursor can follow the pen without actually registering a touch, so the user knows where the tablet sees the pen. That allows for very precise operation, just as seeing the cursor does when operating a mouse. Second, active pens can be pressure sensitive. That can be used for 3D-like operation, and is invaluable for artists and designers. Third, active pens can have very high resolution, which makes them quick and very precise, something that's increasingly important on today's super-high resolution displays. On the negative side, active pen technology is fairly expensive. It can be inconvenient to have to first locate the pen before the tablet is operational. And if the pen gets lost, the device may become unusable.
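
To make the hovering and pressure concepts a bit more concrete, here is a minimal sketch of how they surface in the W3C Pointer Events API that web apps see today. This is an illustration of the concepts only, not any vendor's pen protocol, and the canvas setup and the 1-to-10 pixel width mapping are assumptions made for the example.

```typescript
// Minimal sketch: hovering and pressure as surfaced by the Pointer Events API.
// Assumes an HTML page with a <canvas> element.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d') as CanvasRenderingContext2D;
let last: { x: number; y: number } | null = null;

canvas.addEventListener('pointermove', (e: PointerEvent) => {
  if (e.pointerType !== 'pen') return;      // ignore finger and mouse input

  if (e.buttons === 0) {
    // Hovering: an active digitizer reports the pen position before the tip
    // touches the glass, so the app could track a cursor here.
    last = null;
  } else if (last === null) {
    last = { x: e.offsetX, y: e.offsetY };  // pen just touched down
  } else {
    // In contact: pressure is normalized 0..1; map it to stroke width for
    // the thicker-or-thinner lines described above.
    ctx.lineWidth = 1 + e.pressure * 9;
    ctx.beginPath();
    ctx.moveTo(last.x, last.y);
    ctx.lineTo(e.offsetX, e.offsetY);
    ctx.stroke();
    last = { x: e.offsetX, y: e.offsetY };
  }
});
```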

And what about the pros and cons of passive touch technology?

The good thing is that conventional resistive touch doesn't need a special pen. Any cheap stylus will do, as will a fingernail and even firm finger touch. Resistive touch is also fairly precise as long as it's used with a stylus, and it's totally immune to rain or any wetness. For that reason alone, many rugged tablets and handheld computers have been using resistive touch for many years, and are still using it. But passive resistive touch has some significant disadvantages as well. Finger touch alone is very imprecise and unsuitable for operating small user interface components such as scrollers, check boxes and the like. Even when using a passive stylus, there's no cursor to tell you where exactly the touch will be registered. And there's the issue of "palm rejection," i.e. making sure that the device reacts only to the stylus and not to inadvertent contact from the palm of the user's hand.

The above was roughly the status quo until Apple popularized projected capacitive multi-touch with the iPhone. Procap, or p-cap, as it's commonly referred to, is still passive touch. But it's a far more refined and much more elegant type of passive touch. Instead of requiring a push hard enough to register a "touch," p-cap works via "mutual capacitance," i.e. the decrease in capacitance between a sense electrode and a drive electrode when a finger gets close enough to affect (siphon off, really) the normal capacitance between the pair. This technology only requires a very soft touch, and it's quite precise once a user gets the hang of it. It's also quick because it's electronic rather than physical, and p-cap can easily recognize more than one touch at a time. Apple leveraged all of these strengths to allow the effortless tapping, panning, pinching and zooming that not only made the iPhone a game changer, but also made p-cap the touch interface of choice for virtually all tablets and handhelds.
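
For the technically curious, here's a toy sketch of the mutual-capacitance idea just described: each drive/sense electrode crossing has a baseline capacitance, and a touch is detected where the measured value drops noticeably below that baseline. All values, the threshold, and the function names are invented for illustration; real touch controllers do this in dedicated silicon at high scan rates.

```typescript
// Toy sketch of mutual-capacitance touch detection: a nearby finger siphons
// off some of the baseline capacitance at a drive/sense crossing.
type Grid = number[][];

function findTouches(baseline: Grid, measured: Grid, threshold = 0.15): [number, number][] {
  const touches: [number, number][] = [];
  for (let row = 0; row < baseline.length; row++) {        // drive electrodes
    for (let col = 0; col < baseline[row].length; col++) { // sense electrodes
      const drop = (baseline[row][col] - measured[row][col]) / baseline[row][col];
      if (drop > threshold) touches.push([row, col]);      // capacitance fell -> touch
    }
  }
  return touches; // multiple entries = multi-touch, more than one touch at a time
}

// Example: a 3x3 sensor with one finger over the center crossing.
const baseline: Grid = [[1, 1, 1], [1, 1, 1], [1, 1, 1]];
const measured: Grid = [[1, 1, 1], [1, 0.7, 1], [1, 1, 1]];
console.log(findTouches(baseline, measured)); // [[1, 1]]
```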

However, even the wonderful p-cap technology has its disadvantages. First, the subtle change in capacitance between two electrodes when a finger touches it requires a dry surface. Water, with its great conductivity, tricks the electrodes into false readings. Second, since p-cap doesn't facilitate "hovering" and the finger touch area is fairly large, p-cap operation isn't nearly as precise as that with a mouse or an active pen. Neither of those disadvantages was severe enough to keep p-cap from becoming the success it is. They were, however, the primary reason why some tablets and even phablets became available with active pens. And that even though the late Steve Jobs was adamantly opposed to pens.

There is, unfortunately, no getting around the fact that legacy Windows doesn't work well with p-cap. One result of that recognition was Microsoft's bafflingly unfortunate Windows 8, which imposed not-ready-for-primetime touch functionality on the hundreds of millions, if not billions, using legacy Windows software on the job. Another was that, Jobs' decree notwithstanding, tens of millions bought Samsung's Galaxy Note tablets that combined p-cap with a little Wacom pen, adding precision when needed, and also a handy tool to jot and draw and doodle.

How did all of this affect the industrial and vertical mobile computing markets we cover here at RuggedPCReview? In a number of ways.

While p-cap totally took over on consumer smartphones, it took years for rugged handhelds to switch from resistive touch to p-cap. That's for two reasons.

First, Microsoft simply didn't provide an upgrade path for Windows Mobile, Microsoft's mini-OS that had dominated industrial handhelds for many years. The p-cap-friendly Windows Phone OS was for consumers, and so Windows Mobile, although at some point renamed Windows Embedded Handheld, became a dead end. The result was that while smartphones charged ahead, vendors of industrial handhelds were stuck with an increasingly obsolete OS. In the consumer smartphone market, Android quickly filled the void left by Microsoft, but industrial and vertical market customers were, and still are, much slower to adopt Android.

Second, vertical market customers often wear gloves, and they often have to work in the rain or in wet conditions where p-cap doesn't work well. Between these two reasons, staying with resistive touch and a passive stylus made sense.

The situation, interestingly, was different with tablets. While the capacitive touch-based iPad was a runaway success that was followed two or three years later with equally successful Android tablets that also use p-cap, Android had a much harder time in industrial and vertical markets. There were a good number of attempts at industrial and enterprise Android tablets, and some saw modest success. But on tablets the pull and advantages of remaining part of the established Windows infrastructure were much stronger, and Windows tablets saw, and see, remarkable success. To the extent where not only the majority of vertical market tablet vendors continue to offer Windows tablets and introduce new ones, but where Microsoft itself is heavily investing into its own Surface tablet hardware.

Which, of course, gets us right back to Windows' weakness with pens. Microsoft's very own tablets use active pens in addition to p-cap, in essence admitting that even when using Windows 10, finger tapping alone just won't get the job done.

So we're sort of back to the same predicament pens faced a couple of decades ago. Extra cost, easy to lose, proprietary. You can use any pen or pencil to write on any piece of paper, but a Wacom pen will only work with a Wacom-based tablet, an N-trig pen needs an N-trig tablet, and so on. And none of those proprietary pens can be used on a regular p-cap phone or tablet.

And this, finally, gets me to the Universal Stylus Initiative (USI). USI is a non-profit that was formed in early 2015 specifically with the goal of creating a standard that would allow any active pen to work on any p-cap smartphone, tablet or notebook.

On September 10, 2015, USI announced that its membership had grown to 31 companies, more than doubling from the initial launch in April 2015. Initial membership included such touch/pen heavyweights as Wacom, Atmel, Synaptics, eGalax-eMPIA, Hanvon Pentech and Waltop, as well as electronics industry giants Intel, Sharp, and Lenovo. Since then, Asus and LG have joined, as well as stylus providers (KYE Systems, Primax, Solid Year Co, Montblanc Simplo) and touch controller providers like Parade Technologies, Silicon Integrated Systems, Raydium Semiconductor and STMicroelectronics International.

This immediately brought up the question as to why any vendor of active pen systems would want to have any part of such a thing. After all, these are technologies heavily covered and protected by iron-clad patents and unassailable intellectual property. And the market for (rather expensive) replacement pens is quite profitable.

A visit to USI's website (see here) answered a few questions but not all, as the specification and most of the background technical information, of course, is only available to USI members.

I was fortunate that USI Chairman Pete Mueller made himself available for a much appreciated call. Pete, whose daytime job is that of principal engineer and senior technologist at Intel, explained that while there is indeed extensive intellectual property in active pen technology, the major players in the field have long seen the potential strategic advantage of providing a degree of interoperability. Talks between them are not unusual, as making active pen technology more universally desirable will likely result in a much larger market. (Note that earlier in 2015 Wacom announced a "Universal Pen Framework".)

Think about it: there are by now billions of p-cap smartphones and tablets without active pens. Given the increasing call for productivity-enhancing (i.e. creation, analysis, design, etc.) rather than activity-draining (i.e. news, video, silly cat tricks, games) smartphone and tablet technology, the availability of universally compatible pens might be a massive boon. Unlike today where the only option for tablet users are those awful fat-tipped passive pens that are hardly more precise than fingers, a universal active pen could open up entirely new functionality for little or no extra cost.

Atmel's blog quotes Jon Peddie of Jon Peddie Research as saying, "To date the market has been limited by proprietary touch controller-stylus solutions, which limits OEM choices and cost reductions. With the USI specification released, we expect that the capacitive active stylus market will grow from 100 million units in 2015 to 300 million units in 2018, opening up new markets such as smartphones and all-in-one PCs" (see quote here).

How does USI intend to make that possible? In their words, "the USI standard defines the communication method by which the stylus sends data about the stylus operation to the smart phone, tablet or notebook PC. The data includes information such as pressure levels, button presses or eraser operation. In addition, USI technology makes use of the existing touch sensor (via a technology called Mutual Capacitance) in smart phones, tablets and notebook PCs, so that the added cost for enabling USI technology on these devices is zero or minimal."
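
Since the specification itself is only available to members, here is a purely hypothetical sketch of what decoding such a stylus report might look like. Only the kinds of data (pressure levels, button presses, eraser operation) come from USI's description; the field layout, sizes and names below are invented for illustration.

```typescript
// Hypothetical stylus report decoder -- the real USI wire format is
// members-only; this layout is invented purely to illustrate the data types.
interface StylusReport {
  pressure: number;     // e.g. an 11-bit value, 0..2047 (2048 levels)
  barrelButton: boolean;
  eraser: boolean;
}

function decodeReport(raw: Uint8Array): StylusReport {
  // Invented layout: two bytes of pressure (little-endian), one flag byte.
  const pressure = (raw[0] | (raw[1] << 8)) & 0x07ff; // keep 11 bits
  return {
    pressure,
    barrelButton: (raw[2] & 0x01) !== 0,
    eraser: (raw[2] & 0x02) !== 0,
  };
}

console.log(decodeReport(new Uint8Array([0xff, 0x03, 0x02])));
// -> { pressure: 1023, barrelButton: false, eraser: true }
```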

But if we're talking active pens working with capacitive touch controllers, how could those p-cap controllers possibly work with active pens? Pete couldn't go into details on that because much is non-disclosure material, but the general idea that I got was that using a "USI-enabled" pen on a "USI-enabled" device would provide some, but not all of the full functionality of a particular active pen technology.

What does that mean? A look at original USI member Waltop's website provides some clues. It says that they provide both USI-enabled and vendor-specific styli, and that both types "satisfy the performance requirements of Windows 10, such as accuracy, linearity, latency, hovering height, etc." So presumably the USI standard seeks to cover all the mandated basics of using an active pen on a p-cap touch screen, but there are still special capabilities, extra functionality and perhaps higher performance only available through the full proprietary active pen technology. To use one of my beloved automotive analogies, USI provides the road system that allows any street-certified vehicle to get from point A to point B, but if someone has special desires and requirements, they will still want to get a Lexus or a Porsche.

However, Atmel's blog says that "Through the same sensor that one’s finger uses to command a device, the stylus communicates via different frequencies to perform the action of writing — writing with up to 2048 different levels of pressure to give the pen-on-paper experience and render thinner or thicker lines in note-taking, painting and doodling, just like an ink pen." That sounds more like some of the proprietary functionality of an active pen system is being brought over into USI-spec passive p-cap controllers.

Poking around the web a bit, it seems like USI systems will be able to differentiate between a USI mode and a proprietary mode, or even switch between the two, depending on which seems appropriate. USI pens apparently will be AAAA battery-powered, allowing a slender size and a projected 12-month battery life.

As is, USI hopes to have version 1.0 of its specification done by the end of 2015, and after that we should start seeing active pens that work on any p-cap device that's also compliant with the USI spec. It should be interesting to see what will become available, how well it works, and whether the initiative will take hold. -- Conrad H. Blickenstorfer, September 2015

Related:

  • Website of the Universal Stylus Initiative
  • The Universal Stylus is Coming (Intel Free Press)
  • Universal stylus to bring easy digital inking to tablets (TechRadar)
  • Early example of a USI pen: Baton US-70

    Posted by conradb212 at 05:47 PM | Comments (0)

    August 27, 2015

    Replacing the Atom N2600

    This morning I received an email from German touchscreen device developer and manufacturer faytech. The company prides itself on its ability to design and engineer high-quality products in Germany, then have them manufactured cost-efficiently in Asia, while providing local service.

    This email, though, wasn't about their latest touchscreen products. It was about the processors they use in their touchscreen panels and PCs. Specifically, it was about replacing the ubiquitous Intel Atom N2600 with the newer Intel Celeron N2807 and J1900. Faytech seemed especially taken with the N2807, which the company has chosen as the new standard for their resistive touchscreen portfolio. They said, "replacing the predecessor Intel Atom N2600, the new processor has it all: speed, stability, lower power consumption, and much better performance per watt ratio." The Celeron J1900, by the way, will be the new go-to chip for faytech's capacitive touch devices.

    That caught my attention. Are the N2807 and J1900 Celerons really the N2600's successors? And if so, why? As is, Intel is making so many different processors and has so many different classifications for them that even those following the industry closely often can't tell them apart or explain why one processor should be chosen over another.

    First, why does the N2600 need replacement? True, the "Cedarview" Atom N2600 was launched well over three years ago, an eternity in Intel's rapid-fire chip development cycle. But it turned out to be an exceptionally good chip.

    A third-generation descendant of the Atom N270 that powered tens of millions of netbooks, the N2600 and its slightly faster N2800 sibling were the first Atom chips to use 32nm process technology instead of the older 45nm, making for smaller, more efficient packages. Cedarview processors were dual-core systems, whereas before only desktop-oriented Atom versions had two cores. Graphics performance benefitted from a different design and much faster clock speed, resulting in Intel claims of 2X graphics performance compared to the second generation Atoms. And integrated hardware-accelerated video decoding enabled smooth full HD (up to 1080p) video playback, something that was not possible with earlier Atom chips.

    Word on the N2600's qualities got around, and a lot of manufacturers that had been burned by the poky performance of most original Atom chips switched to the N2600. When RuggedPCReview benchmarked an N2600-updated Handheld Algiz 7, we found an overall 3X performance improvement over the earlier Atom Z530-based version. Another example was Motion Computing's upgrade from an Atom Z670 in its original CL900 tablet to the N2600 in its CL910 successor. Again, the performance improvement was substantial (in the 70% range).

    We praised the N2600 as "probably the best general purpose Atom chip so far." N2600 performance was so good that some manufacturers who later upgraded to some of the lower-end Intel "Bay Trail" chips were in for a harsh surprise. For example, RuggedPCReview found a "Bay Trail" E3825-based tablet no faster than its N2600-powered predecessor.

    But that was in 2013, and it's now Fall 2015. The N2600's reign seems to be coming to an end, and Intel's "Bay Trail" platform is the reason.

    Bay Trail represents a departure from Intel's product strategy of the past few years that differentiated between low end (Atom) and high end (Core) processors. Instead, Bay Trail consists of a large family of single-, dual- and quad-core chips optimized for various types of devices. Lower end Bay Trail processors use Intel's "Atom" brand, whereas higher end versions targeting tablets, notebooks and desktops carry Intel's "Celeron" and even "Pentium" brand names.

    Further, for the first time, an Intel Atom microprocessor architecture is paired with genuine Intel graphics. The graphics cores integrated into Bay Trail systems are of the same HD 4000 architecture and variety as those used in Intel's 3rd generation "Ivy Bridge" processors, albeit with fewer execution units (four instead of several times that number) and lower clock speeds. The new graphics support most of the same APIs and features, including DirectX 11, OpenGL 3.x (and even 4.0 in some cases), and OpenCL 1.2. Better yet, some clever power-saving features from 4th generation "Haswell" Core processors seemed to be included as well.

    So it's no surprise that Bay Trail has been a resounding hit. By and large, the impression on the street is that "Bay Trail" is much faster than all those old-style Intel Atom chips, fast enough to do some actual general purpose work, even tough assignments that just may come along on any given day. That includes the kind that brought almost every older Atom system to its knees. And it all comes at a cost that's a lot lower than full Intel Core processors. From Intel's vantage point, Bay Trail cut down on the complaints about Atom performance while, despite all the improvements, still remaining quite a ways behind full Core processor performance and thus no threat to that lucrative market.

    The only problem is that it further increased confusion about Intel's various product lines. Bay Trail chips, while all using an Atom CPU architecture, are sold as Atoms, Celerons and Pentiums. Aren't Celerons gutless entry-level loss-leaders and Pentiums some ancient brand from a distant past? Not anymore, apparently. Or at least it's a different kind of entry level now. Figuring out the difference between all those Bay Trail chips isn't easy. And to make matters more confusing yet, some new Celerons aren't Bay Trail chips at all; they are Intel "Haswell" Core processors (like the Celeron 2980U).

    So what about the Celeron N2807 and J1900 that the good folks at faytech chose to replace the N2600 as the standard in their touch PCs and panels? Let's take a look at the two chips.

    Both of them are based on 22nm lithography instead of the older N2600's 32nm, both use the FCBGA1170 socket, and both use low-power DDR3L RAM. But that's where the similarity ends.

    The J1900, which Intel lists as an embedded desktop processor, is a quad-core chip with 2MB of L2 cache, running at a base frequency of 2GHz and a maximum burst frequency of 2.42GHz. Its thermal design power is 10 watts, it can support up to 8GB of RAM, its base graphics frequency is 688MHz with a top speed of 854MHz.

    The N2807 is listed as an embedded mobile processor. It is a dual-core chip with 1MB of L2 cache, running at a base frequency of 1.58GHz and a maximum burst frequency of 2.16GHz. Its thermal design power is 4.3 watts, it can support up to 4GB of RAM, its base graphics frequency is 313MHz with a top speed of 750MHz.

    For a more detailed spec comparison of the N2600, N2807 and J1900, check this page.

    In terms of raw performance, the J1900 would seem to have a clear advantage over the N2807, even though Intel actually lists the recommended price of the N2807 as higher than that of the J1900 (US$107 vs. US$82). Why is the slower N2807 more expensive? Likely because as a "mobile" chip it includes additional low power modes. It also includes a number of special Intel technologies that the J1900 doesn't have (like Intel Secure Key, Intel Idle States, Intel Smart Connect, and Intel Wireless Display). Unfortunately, even Intel's spec sheets only present an incomplete picture as the sheets are inconsistent. The technically minded will find some more info in the very technical Features and specifications for the Intel® Pentium and Celeron Processor N- and J- Series document.

    What about real world performance? Here we can present some hard data.

    According to RuggedPCReview's database, N2600-based devices scored an average of 476 in the PassMark 6.1 CPU Mark test, 433 in the overall PassMark suite, and 56,070 in the CrystalMark suite. J1900-based devices scored an average of 2,068 in the PassMark 6.1 CPU Mark test, 974 in the overall PassMark suite, and 117,000 in the CrystalMark suite. The sole N2807-based device scored 782 in the PassMark 6.1 CPU Mark test, 570 in the overall PassMark suite, and 83,800 in the CrystalMark suite. Based on these numbers, one might expect an N2807-based system to offer a roughly 1.5X overall performance increase over an N2600-based device, and a J1900-based system a roughly 2.2X overall performance increase over an N2600-based device. And a J1900 system might offer roughly 1.5X overall performance over an N2807-based device.
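
    For transparency, here's the simple arithmetic behind those estimates, based on the averages just quoted; the rounded ratios appear in the comments.

```typescript
// Worked arithmetic for the estimates above, using the database averages
// quoted in the text (PassMark 6.1 CPU Mark, overall PassMark, CrystalMark).
const scores = {
  N2600: { cpu: 476, overall: 433, crystal: 56070 },
  N2807: { cpu: 782, overall: 570, crystal: 83800 },
  J1900: { cpu: 2068, overall: 974, crystal: 117000 },
};

function ratio(a: keyof typeof scores, b: keyof typeof scores) {
  return {
    cpu: scores[a].cpu / scores[b].cpu,
    overall: scores[a].overall / scores[b].overall,
    crystal: scores[a].crystal / scores[b].crystal,
  };
}

console.log(ratio('N2807', 'N2600')); // cpu ~1.64, overall ~1.32, crystal ~1.49 -> ~1.5X
console.log(ratio('J1900', 'N2600')); // cpu ~4.34, overall ~2.25, crystal ~2.09 -> ~2.2X on the suites
console.log(ratio('J1900', 'N2807')); // cpu ~2.64, overall ~1.71, crystal ~1.40 -> ~1.5X
```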

    So is replacing the venerable N2600 with either one of those Bay Trail chips a good idea? Yes. Time stands still for no one, not even for a good chip like the Atom N2600 of just three or so years ago. But we also believe that given the overall breathtaking pace of progress in the CPU arena, replacements should provide a clear and very noticeable boost in performance, not just an incremental one. So our money would be more on the Celeron J1900, which seems to have all the goods to be a solid, remarkably speedy, yet still economical go-to chip for any number of industrial computing projects where absolutely minimal power consumption is not the highest priority.

    There is, of course, a large number of other processor options from Intel and from other sources. But the x86 world likes standards and things to rely on, and so we've historically seen a flocking to a small number of chips that offer a good overall balance. Chips like the Celeron N2807 and J1900.

    Posted by conradb212 at 04:19 PM | Comments (0)

    April 17, 2015

    Xplore Technologies acquires Motion -- How it came about

    Today I listened to the full investor call that Xplore held on April 16 about its acquisition of Motion Computing, and a lot of things are clearer now (listen to it here).

    Motion didn't exactly choose to be acquired, and this was not one of those situations where a big company comes along and makes a financial offer just too sweet to resist. What happened was that Motion found itself in a financial bind caused by third-party issues over which Motion had little to no influence. Specifically, the supplier of the displays used in its Motion C5 and F5 semi-rugged tablets shut down its plants in South Korea without any notice to speak of. This left Motion, which uses a built-to-order model, high and dry and unable to fill C5 and F5 tablet orders, with those two products accounting for about half of Motion's sales. With half of its sales essentially on hold, Motion's financial situation quickly went from quite stable to critical, until its main lender foreclosed.

    This requires some background information. Motion, like most US electronics vendors, relies on Asian OEMs (Original Equipment Manufacturers) and ODMs (Original Design Manufacturers) to make its products. There are various nuances to such agreements, but suffice it to say that those OEMs and ODMs likewise rely on their vendors to supply parts. One such part was screens from a company called Hydis. Unfortunately, Hydis' parent company, Taiwanese E Ink, saw fit to close two of the Hydis LCD manufacturing plants in South Korea.

    Now one might assume it should be easy to source replacement screens for tablet products that, while quite successful, were not produced in the millions or even hundreds of thousands. It obviously can be done, but it's not easy. There's locating a suitable replacement, there's business arrangements, there's adapting and configuring and testing, all of which takes time, time which isn't available to a company doing build-to-order. Components are changed and re-sourced all the time, but always with proper notice and proper lead time. Apparently, that wasn't the case with E Ink's shutdown of the Hydis plants.

    A bit more about Hydis. They make screens that we here at RuggedPCReview have long considered the very best. Originally launched by Korean Hyundai as Hyundai Display Technology and then part of Hyundai's Hynix Semiconductor, Hydis started working on a wide-viewing-angle technology called "fringe field switching" (FFS) in 1996. That came in response to Hitachi launching the IPS display technology, which also provides superior viewing angles. Hydis was so convinced of the merits of its FFS technology that it decided to pretty much bet the farm on FFS. That was understandable, as FFS not only provided 180 degree viewing angles from all directions, but also offered terrific contrast ratio, none of the dreaded color shifts that lesser LCDs to this day display when viewed from certain angles, lower power consumption than IPS, and also no "pooling" upon touch.

    Hydis was spun off from Hyundai completely in 2001, just when Microsoft's Tablet PC effort got underway, and Hydis saw a big opportunity to be the dominant player in tablets. I recall that at Pen Computing Magazine we were blown away when we saw Hydis FFS displays in Sharp's Actius TN10W and HP/Compaq's TC1100 notebook convertibles in 2003. The Hydis displays were so much better than anything else that there simply was no comparison.

    Just when things appeared to look bright for Hydis, Hynix sold them off to Chinese LCD manufacturer BOE, and the company became BOE Hydis. Between the Tablet PC never living up to expectations and other issues, BOE Hydis didn't do well and was acquired by Taiwan's Prime View International (PVI), which eventually, in 2010, became E Ink, the folks who pretty much had cornered the market on those paper-like eBook reader displays used by the Kindle and also by Sony. PVI had actually managed to nurture Hydis back to financial health, but did so primarily by selling and licensing Hydis FFS patents. This led to Korean protests that after BOE, E Ink was also simply "asset-stripping" Hydis and thus Korean intellectual accomplishment. Add to that the fact that E Ink fell on hard times itself after the eBook market was severely impacted by the iPad and iPad-class tablets, and it's no surprise that they didn't have the resources to properly fund Hydis.

    That said, and much to Hydis' credit, the company did not rest on its FFS laurels. First came an improved FFS, and then, in 2007, AFFS+, which perfected the technology in numerous respects, including even better outdoor viewability.

    Motion, always on top of technological advances, was an early adopter of Hydis displays, giving it an edge over competitors that used lesser LCDs not nearly as well suited for tablets, where the ability to view the image on the display from any conceivable angle matters much more than it does in laptops. The superior quality of the Hydis AFFS+ displays used in Motion's C5 and F5 tablets contributed to their wide acceptance, and continued to do so even in the latest generation of the platform, launched in February 2015.

    Unfortunately, February 2015 was also the month when E Ink suddenly shut two Hydis plants in South Korea. The stated reasons were "chronic losses" and "high manufacturing costs." That didn't sit well with the Koreans, who felt that E Ink had let the plants become obsolete, on top of simply mining Hydis for its patents. The bottom line for Motion was that it had a very promising new generation of its C5/F5 platform based on Intel's latest 5th generation "Broadwell" chips, and no screens.

    No product, no sale, and the rest is history.

    Enter Xplore. Located within ten miles of one another, the two companies knew each other well. Some of the workforce had worked at both companies, and both certainly also benefitted from the presence of Dell, which, although it has no deep business relationship with either Motion or Xplore, ensures a steady pool of highly qualified technologists in Austin.

    But if Hydis displays were so good, and especially well suited for tablets, didn't Xplore use them as well? They did. Xplore, too, was an early Hydis adopter, and that move may well have been what helped Xplore survive and eventually thrive while some of its direct competition did not. Xplore's high-end, ultra-rugged iX104 XC6 tablet has a Hydis screen. So didn't the Hydis shutdown affect Xplore as well? It did, but Xplore had inventory and did not entirely rely on build-to-order. And while the Hydis shutdown certainly had an impact on Xplore as well, Xplore's financial situation was different from Motion's, and it was able to not only absorb the blow, but also turn it into an opportunity by taking over a very complementary Motion Computing.

    There haven't been many better examples of making lemonade from lemons. Had Xplore not been there, just ten miles away and a perfectly logical rescuer, Motion would likely have been liquidated and ceased to exist entirely. That would have been a massive blow to Motion's customers and employees, and also to the rugged computing industry in general. As is, Xplore says that "the vast majority" of Motion employees have been extended employment offers.

    So what can we expect from all this? As is, Motion's sales apparently peaked in 2012 at over $100 million, and were still in the $80 million range for 2014. Xplore is a roughly $40 million business. Xplore warns that this doesn't mean that they're suddenly a $140 million business, and that it'll take a year or two until everything has been integrated and ironed out. Xplore Chairman Philip Sassower likened the opportunity to being given the chance to pick up a distressed mansion in a $5 million neighborhood for three quarters of a million. It'll take work and perhaps half a million in investment, but then it'll be a $5 million mansion again.

    And then there are the numbers. VDC estimates the 2015 market for rugged tablets as being worth about $585 million, and the 2015 market for rugged laptops about $1.2 billion. And that's on top of a massive number of consumer tablets, a portion of which go into enterprises that would just love to have some more durable gear. So there's plenty of upside. Xplore is already working on getting suitable replacements for the Hydis screens. And Sassower wants Xplore to be #1 in rugged tablets. -- Conrad H. Blickenstorfer, April 17, 2015

    Posted by conradb212 at 07:46 PM | Comments (0)

    Xplore acquires Motion -- what it means

    On April 16, 2015, Xplore Technologies and Motion Computing announced that Xplore was acquiring Motion. This was not a total surprise, as both companies are in the rugged tablet computer market, both are pioneers in tablets, and both are located within ten miles of each other in Austin, Texas.

    And yet, the announcement came as a surprise to me. When I interviewed Motion CEO Peter Poulin in February of this year, Poulin ended by saying "Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3," and he followed up by saying the company had just totally revamped all of its platforms for much greater performance and enhanced wireless communication and ruggedness. And that it had other products in the pipeline. "We're quite optimistic," Poulin concluded. And yet, just a couple of months later, Motion was acquired.

    The move was also a surprise because both Xplore and Motion have shown remarkable resilience in the face of setbacks and challenges throughout their existence. In an era when numerous rugged computing gear manufacturers either folded or were absorbed by either Motorola or Honeywell, Xplore and Motion persevered and remained independent.

    As a privately held company, Motion's business fortunes were closely guarded, but the company's savvy, skills and determination were apparent throughout its history, starting with the unenviable task of taking Microsoft's flawed 2001/2002 Tablet PC initiative and running with it. Though highly publicized and initially supported by most major PC manufacturers, the Tablet PC didn't find widespread acceptance due to high costs and technology that just wasn't quite ready yet. Yet, Motion toughed it out and established for itself a nice niche in enterprise and vertical markets.

    Both Xplore and Motion were especially skillful in recognizing valuable technology advancements early on, and quickly making them available to their customers. Both companies were pioneers in such productivity-enhancing features as dual input where pen and touch worked in harmonious unison, superior outdoor-viewable displays, ergonomics suitable for actual tasks at hand, and the ability of their products to not only hold up in challenging daily use, but also perform at full speed under any operating conditions.

    On the Motion side, the company's early adoption of Intel's Mobile Clinical Assistant (MCA) platform was an impressive example of its unerring compass for what worked and made sense, and what didn't. Motion's C5 MCA -- with its square layout, integrated carry handle, and peripherals placed in exactly the right spots -- became a big success, so much so that Motion added an F5 version of the platform for general enterprise and industrial use. Most impressively, while a good dozen other companies also introduced Intel MCA-based tablets, most quickly abandoned them again, lacking Motion's razor-sharp focus on its markets and tablet products.

    Fellow Austin resident Xplore impressed through sheer determination. Time and time again Xplore found new investment as the company's leadership tirelessly presented its case. Which wasn't always easy with what for a long time was essentially a one-platform product lineup.

    I well recall first seeing them at a Comdex trade show in Las Vegas in the late 1990s where they had a large, terrific display, a convincing message, and jaw-dropping prototypes that, however, were not quite final yet. That was quite common back in those days, and most of the attempts led nowhere. But Xplore was back the next year, and the year after that.

    When we published the Pen Computing print magazine and did annual Editor's Choice and best product awards, Xplore scored with its impressive GeneSys Maximus. I remember calling Xplore with the good news, and they were disappointed that it was the GeneSys that got the recognition, and not their then semi-secret brand-new iX104. Little did we know that that machine was to become the core engine of Xplore's success and future.

    So why Xplore and Motion got together now, after all those years, I don't know. Business imperatives, I assume, and I am sure it makes perfect sense. But what does it mean looking forward, especially in the light of many such acquisitions that did not work out for the best? In the past we've seen large companies almost mindlessly snapping up much smaller ones. That's not the case here. We've seen fierce competition where one competitor eventually came out on top and annihilated the other. That's not the case here either. So let's see what Xplore and Motion bring to the table.

    Historically, Xplore has been tending to the ultra-rugged tablet market whereas Motion concentrated on a variety of vertical markets that required durable, specially designed and configured tablets. Motion does not have anything that competes with the various versions of Xplore's ultra-rugged iX104 tablets (see here). Xplore doesn't have anything like Motion's C5 and F5 semi-rugged tablets with their integrated handles. Xplore also doesn't have anything like Motion's R12 tablet with its big 12.5-inch screen (see here). So there's no overlap there. And Motion doesn't have anything Android-based, whereas Xplore has its modern, innovative RangerX tablet.

    There is a degree of overlap in just one area, and that's in the promising and potentially quite lucrative area of compact lightweight Windows tablets. Those are the tablets for users who do need Windows, but want it in a trendy, sleek and attractive iPad-like design that's tough enough to hold up on the job. For that Xplore has its Bobcat (see here) and Motion has its CL920 (see here). These two, though, are also different enough to be able to co-exist in the short term, with the CL920 as the unified company's enterprise market tablet entry and the Bobcat for tougher assignments that require more armor and a higher level of sealing.

    Most importantly, there is very little existing customer overlap with these two companies. Xplore has traditionally concentrated on oil & gas, military, government, heavy industry, and similar, whereas Motion is primarily active in healthcare, retail, construction, field service, and so on. If Xplore plays its cards right, it can emerge as a much larger company with a much broader reach, and also perhaps as an example of where 1 + 1, for once, adds up to more than 2. I've said it ever since the consumer tablet boom began, and I'll say it again: with the tablet form factor fully accepted virtually everywhere, there's tremendous opportunity for rugged equipment vendors to step in and successfully provide this desired and contemporary form factor in products that do not break on the job and in the field.

    Overall, this development may also be good news for other independents in the rugged tablet market, companies like Getac, GammaTech, MobileDemand, the Handheld Group, and others: resistance is not futile. Keeping it in the family and preserving the unique, special expertise of the rugged computing industry may well be the best way to success and prosperity.

    -- Conrad H. Blickenstorfer, 4/16/2015

    Posted by conradb212 at 12:05 AM | Comments (0)

    February 10, 2015

    Conversation with Peter Poulin, CEO Motion Computing

    On February 5th I had a chance to speak with Peter Poulin, who was appointed Motion Computing's CEO on December 11, 2014. An industry veteran with more than 25 years of sales, marketing and general management experience in the public and private sectors, the company's press release said Poulin's goal will be to capitalize on the company’s deep mobility expertise and aggressive investments in the design and development of ruggedized tablet platforms and integrated mobility solutions to expand its reach within target vertical markets.

    Over the years, I've been covering Motion's vertical market tablet lineup in some detail, going back to a meeting in San Francisco in 2001 where Motion CEO Scott Eckert and founder David Altounian showed me the prototype of their first tablet. Motion was a startup then, formed to be part of Microsoft's Tablet PC initiative.

    While the overall Tablet PC project was not as successful as Microsoft had hoped, and it would be almost another decade before the iPad finally put tablets on the map, Motion established itself as a provider of enterprise and ruggedized tablets in various niche markets. Motion succeeded where many others failed with their early tablet efforts due to focusing on tablets and tablets alone (Microsoft itself had flip-flopped during the Tablet PC gestation period, switching emphasis from pure tablets to convertible notebooks), and also by displaying an unerring ability to recognize new trends and technologies at an early stage and making them available to their customers.

    One look at Poulin's resume shows that he's uniquely qualified for the job as Motion's CEO. An electrical engineer with a degree from Cornell, Poulin worked for Compaq for 13 years in sales, marketing and management. He then broadened his expertise and horizons with sales and business development stints at such diverse technology companies as Segway, NetBotz, APC, and internet solutions providers Hoovers and Virtual Bridges. Poulin joined Motion Computing in July 2012 as VP of marketing and then ascended to CEO.

    Here are some of the highlights of our discussion:

    RuggedPCReview: Microsoft's Tablet PC initiative wasn't a great success and most early tablet providers exited the market within just a couple of years. Except Motion. What made Motion successful with early Microsoft-based tablets where others failed?

    Poulin: The answer is, it's not just about the tablet. It’s really about understanding the customers’ workflow, and integrating the technologies that enable that workflow, of which the tablet is one component. Motion decided early on to focus on a limited number of verticals. In the process we gained a great amount of expertise on how customers use technology. I believe what differentiates Motion is that we have very purpose-built devices that truly make the job easier. An example is the unique keyboard pairing we use with our R12 tablet. It's super-easy and there's none of the frustration users often have with Bluetooth pairing sequences. We know how field service workers work, we know how to build docks that work for them, peripherals that work for them, features that they need. Yes, we seek to grow as a company, but we are careful not to lose that depth and connection to our customers and spread ourselves too thin.

    RuggedPCReview: Ever since the introduction of the iPad, tablets have become a huge success. But the success is primarily in consumer markets and to some extent in the enterprise. Why is that?

    Poulin: We see the tablet as a very successful enterprise tool, and we have the mass consumer adoption of the tablet to thank. However, the consumer and the enterprise have very different needs. For many enterprises and vertical market businesses it's a matter of how to reduce deployment risks. They want to know how they can protect their investment. They need to leverage existing investment. They need to reduce downtime. They need to focus on user work flows. And they know that if their users don't embrace a technology it just won't work. One of our customers, Thames Water, engaged in extensive user testing (200 users) before making a decision. Our Motion F5 was chosen as the clear favorite in user testing, with 86% preferring the device over two competing alternatives. Our tablets are replacing a fleet of legacy handhelds to run Thames Water's SAP and ClickSoftware asset and field management systems. User testing and user acceptance were key elements in Thames Water's decision to choose Motion.

    RuggedPCReview: Over the years, Motion has generally been a pioneer in quickly making new and better technologies available to users. Examples are superior displays, input technologies, the latest processors, new form factors, etc. Is this part of Motion's corporate culture?

    Poulin: Motion has always had a lot of excellent tech people. We have the discipline of big corporation experience, complemented by the agility of startup experience, and that helps us move fast, be first, and be innovative. This has undoubtedly shaped Motion's culture. But I believe we also have a great balance between technical and customer experience. While the tech innovations are most visible, we're also constantly working on details, peripherals, modules, and how to easily make them part of our tablets. That takes a lot of risk out of integration, and our customers appreciate that.

    RuggedPCReview: We currently have this interesting situation where Apple and Android-based tablets almost completely dominate the consumer markets, whereas Microsoft remains strong in tablets designed for enterprise and vertical markets. For now, all of Motion's tablets use Windows. How do you see the Windows versus Android situation?

    Poulin: We watch that situation very, very carefully. I think one difference between consumer and vertical markets is that on the vertical side it all depends on application software, the kind that does all the heavy-duty lifting, and almost all of that runs on Microsoft. Are verticals adopting Android? Yes, to some extent. Some of our customers are trying Android with very narrow apps for certain very specific tasks. The challenge for Android comes with heavier duty apps, development and maintenance cost, and the fact that, for now at least, Android changes so very quickly and older versions are no longer supported. For IT organizations, that cadence of change is painful.

    RuggedPCReview: Microsoft is putting a heavier emphasis on cloud services. Where do you stand on that?

    Poulin: Given the ubiquity and ever-increasing performance and reliability of broadband connections, Motion is paying a lot of attention to cloud-based applications and services. Along with that, security is becoming an ever-greater concern, both in the cloud and also with broadly distributed devices. Motion has long considered security as an integral part of our products and services with TPM, Computrace, multi-factor authentication, etc. In our newly-released F5m and C5m tablets, we're stepping security up by another level with self-encrypting drives.

    RuggedPCReview: While Microsoft certainly still has a huge edge in enterprise and vertical market deployments, there are also challenges as Microsoft attempts to integrate mobile markets into its OS strategy.

    Poulin: Yes, there's certainly a challenge with Windows 8 and 8.1, but overall they're getting bashed a bit too much. Microsoft hasn't done bad, and things are only getting better now. Microsoft is just so eager to get it right that perhaps they moved to catering to consumers a bit too fast, and that can be very disruptive to the enterprise. Then there are the migration issues. Windows 7, Windows 8, Windows 8.1, and soon Windows 10, and they need to support everything. It's not easy to make an OS attractive to consumers as well as corporate customers.

    RuggedPCReview: On the hardware side, Intel has been charging ahead at a very rapid pace with successive generations of Core processors. How difficult and important is it to keep up with Intel?

    Poulin: It's not that complicated on the high end, because the performance levels are there, and have been there for a while. Motion customers do not always want such a rapid pace, so sometimes they skip a generation, and sometimes it's tempting to skip two. It's not so simple at the low end, where it took Intel a while to get up to speed with the Atom platform. That was a bit tough for a while, but they're now sorting that out, and Motion is very confident in the range and predictability of Intel’s product roadmap.

    RuggedPCReview: We can't help but notice that Austin, Texas, seems to be a hotbed for tech development and rugged systems. Dell is there, of course, and Motion, and also Xplore. What makes Austin special?

    Poulin: There's lots of talent in the Austin area. There are lots of big companies and also a vibrant startup community. Somehow it all came together.

    RuggedPCReview: Where, overall, does Motion stand now, and what are the plans for 2015 and beyond?

    Poulin: Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3. And we've just totally revamped all of our platforms, the CL-Series, the C5 and F5 Series, and the R12. All have much greater performance, and we also enhanced wireless communication and ruggedness. And we have other products in the pipeline. So we're quite optimistic.

    -- Conrad H. Blickenstorfer, Editor-in-Chief, RuggedPCReview

    Posted by conradb212 at 04:34 PM | Comments (0)

    December 21, 2014

    Storage wars

    Anyone who's ever watched the "Storage Wars" reality TV series on the A&E Network knows that with storage, you never know what you're going to get. That's true for stuff people stow away in storage units, and it's also increasingly true with the kind of storage in our electronic devices.

    There was a time when RAM was RAM and disk was disk, and for the most part the only rule was that more was better. But that was in the era when you could count the total number of available Intel processors on the fingers of a hand or two instead of the roughly one thousand they offer now.

    These days things are much more complicated. And just as no one seems to quite know for sure why a certain processor was chosen for a particular product, there are no easy answers as to why a product uses this type of storage versus that. There often appears to be a disconnect between the engineers who make those often arcane decisions and the marketing side of things that must explain what it's for and why it's better.

    Here are a couple of examples we've recently come across:

    DDR3L vs. LPDDR3 — In mobile computing devices designed to draw as little power as possible you generally not only find "low power" processors (where low power refers to the low amount of electricity use, not the chip's performance), but also DDR3L random access memory. In the case of DDR3L, the L stands for a lower voltage version of the DDR3 memory standard, and more specifically 1.35 Volts instead of the standard 1.5 Volts. That kind of makes sense.

    However, you now increasingly also see devices with LPDDR3 RAM, especially in handhelds and tablets. LPDDR3, it seems, was created specifically for such devices, and their need to be able to go into standby modes where the memory uses as little power as possible. LPDDR3 runs on 1.2 Volts, is part of the main board, and generally uses only about a tenth as much power while in standby as does regular DDR3 RAM.

    Initially I thought LPDDR3 was a lower-cost type of memory, as we first saw it in lower-cost units. But apparently its cost is actually higher than that of standard or even low-power DDR3. And it's used in high-end devices such as the Apple MacBook Air.
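
    One way to see why those voltage steps matter: to a first approximation, dynamic CMOS power scales with the square of the supply voltage. The quick sketch below applies that rule of thumb to the three voltages mentioned above; it deliberately ignores frequency, termination and the standby-mode savings where LPDDR3 earns most of its keep.

```typescript
// Rough rule of thumb: dynamic CMOS power scales with supply voltage squared
// (P proportional to f*C*V^2). Standby savings are not modeled here.
const volts = { DDR3: 1.5, DDR3L: 1.35, LPDDR3: 1.2 };

for (const [kind, v] of Object.entries(volts)) {
  const relative = (v / volts.DDR3) ** 2;
  console.log(`${kind}: ${v}V -> ~${(relative * 100).toFixed(0)}% of DDR3 dynamic power`);
}
// DDR3: 100%, DDR3L: ~81%, LPDDR3: ~64%
```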

    SSD vs. eMMC — We've long been seeing substantial differences in benchmark performance between different types of solid state storage. There's generally not much of a difference between the performance of different rotating media, a large difference between solid state storage and rotating media (solid state is much faster), and a very large difference between different types of solid state storage.

    Recently we observed an almost 10:1 difference between the SSD and eMMC mass storage performance in two tablets of roughly the same size, one low-end and one higher-end. SSD stands for Solid State Disk, eMMC for embedded Multi Media Card. The generic difference between the two, apart from a great variance in flash memory speed itself, is the sophistication and complexity of the controller and interface. eMMC uses a basic controller and a relatively slow interface, whereas SSDs use complex controllers and one of the various SATA interfaces.
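
    To make such comparisons concrete, here is a minimal Node.js sketch of the kind of sequential-read timing that exposes these gaps. The file path is a placeholder, and real benchmark suites such as PassMark also measure random I/O, where the controller differences show up even more.

```typescript
// Minimal sequential-read throughput test (Node.js). The test file path is a
// placeholder; use a file large enough to defeat OS caching for honest numbers.
import { openSync, readSync, closeSync } from 'node:fs';

function sequentialReadMBps(path: string, chunkBytes = 1 << 20): number {
  const fd = openSync(path, 'r');
  const buf = Buffer.alloc(chunkBytes);
  let total = 0;
  const start = process.hrtime.bigint();
  let n: number;
  while ((n = readSync(fd, buf, 0, chunkBytes, null)) > 0) total += n;
  const seconds = Number(process.hrtime.bigint() - start) / 1e9;
  closeSync(fd);
  return total / (1024 * 1024) / seconds;
}

console.log(`${sequentialReadMBps('/path/to/testfile').toFixed(1)} MB/s`);
```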

    There's always an inherent answer to any tech question; it's just that those answers aren't easy to come by. What I'd like to see more of is how something works, what its pros and cons are, and why it's used in certain products.

    Posted by conradb212 at 04:28 PM | Comments (0)

    November 04, 2014

    Intrinsically safe ecom Tab-Ex: another rugged tablet based on Samsung hardware

    This morning I saw in the news that at the IFA Berlin show, ecom instruments launched what the press release called the "world's first Zone 1/Div. 1 tablet computer". New tablets are launched all the time, but I quickly realized that this was relevant for two reasons:

    First, the Zone 1/Div. 1 designation means it's a tablet for use in hazardous locations. Zone 1, in the IEC/ATEX classification system that handles intrinsic safety issues, means the device can safely be used in areas where flammable gasses are likely present. In essence, that requires that there's no chance the device can generate sparks or anything else that could set off an explosion. I'd need to look up what exactly Div. 1 refers to; there are two different entities handling these classifications, the North American National Electric Code (NEC) and the European ATEX directive.

    Intrinsically safe devices, i.e. devices that are incapable of igniting gasses, dust or vapors, are very important in certain deployments, and so this new ecom tablet will certainly attract attention.

    The second reason why the new ecom Tab-Ex tablet is relevant is that it's another example of a stock consumer Samsung tablet inside a specially developed case. In May 2014 we took a detailed look at the N4 device from Two Technologies, which is a Samsung Galaxy Note II with a value-added rugged case that also includes a second battery and a physical keyboard (see our review here). But whereas the Galaxy Note II is a 5.5-inch "phablet," the ecom tablet is based on the Samsung Galaxy Tab Active with a much larger 8-inch screen. And the Tab Active offers pretty impressive ruggedness specs even without any third-party enhancements: it's IP67-sealed, it can handle four-foot drops, and its 400-nits backlight is bright enough for outdoor viewing. The Tab Active is a US$700 tablet and you can see its full specs here.

    ecom isn't hiding the fact that their Tab-Ex is based on a Samsung tablet. Even the press release openly states that this type of solution "provides compatibility and a wide range of preloaded applications for a safer work environment, including unparalleled security functions like device encryption, MDM, VPN and secure connectivity (Samsung KNOX)." And, perhaps even more importantly, that "being able to offer the same internal tablet, Samsung Galaxy Tab Active, ecom provides a key benefit of consistency in product use - whether Rugged, Zone 2 / Div. 2, Div. 1 or Zone 1. And, the software applications you develop for the Samsung Galaxy Tab Active will work unchanged on the Tab-Ex 01."

    All of this makes the Tab-Ex another interesting case in the ongoing discussion of where the rugged computing industry should move to best take advantage of the worldwide popularity, and impressively high tech, of consumer smartphones and tablets. As is, there are vehemently different opinions in the industry. Some feel that it makes perfect sense to pack readily available consumer technology into a value-added case, whereas others feel that the guts of a rugged device have to be just as rugged, and that the rugged market is inherently incompatible with the 6-month product/technology cycle of the consumer market.

    Below you can see the ecom Tab-Ex on the left, and the donor Samsung Tab Active on the right.

    And here are the ecom Tab-Ex product page and the Tab-Ex brochure.

    Posted by conradb212 at 03:20 PM | Comments (0)

    September 29, 2014

    GoPro 4 -- the GoPro phenomenon, how it came about, and why it matters

    On September 29, 2014, GoPro announced the GoPro Hero 4 and also a new basic GoPro camera. This is relevant for anyone dealing with ruggedized computing gear and using technology in the field for a number of reasons. But first, what exactly is GoPro and why do their products matter?

    GoPro makes tiny little action cameras that, somehow, captured the public's imagination and are now found seemingly everywhere. You could almost say they've become a cultural phenomenon. GoPros are mounted on skydiver and motorcycle helmets, on race cars, on skateboards, on boats, on planes and drones, and seemingly on everything else, sometimes several of them. Now you can even mount them on dogs.

    Why did this happen? GoPro cameras aren't particularly attractive — just featureless little boxes — and for several years they weren't very well known either. Initial markets were pretty much limited to what founder Nick Woodman had made the little camera for: surfers. Lore has it that since every surfer wants to be noticed and "go pro," that's where the name GoPro came from. Looks didn't matter as long as the camera could be mounted on a surfer or on a surfboard, and as long as it could capture their exploits. Initial GoPros weren't particularly exceptional. Even by 2005, when increasingly capable digital cameras had long been available, the first GoPro was just a cheap Chinese-made plastic film camera. Digital video arrived in the GoPro line in 2006, in a little camera that looked remarkably like what GoPros still look like today.

    That, of course, was the era of the Flip camera with its USB port. Flip was wildly successful for a couple of years, but was then, for inexplicable reasons, bought by Cisco of all companies, which did not seem to have any interest in it and simply shut Flip down, apparently seeing no market for small inexpensive vidcams.

    But two things were happening that made all the difference for fledgling GoPro. The first, of course, was the demise of Flip, which left lots of people wanting a small, handy action cam. The second was the convergence of immensely powerful new video compression technology and inexpensive small storage media. Between those two technologies, it was suddenly possible to record glorious high definition video without the need for a big, bulky tape drive.

    It's probably fair to say that without a company by the name of Ambarella, the GoPro revolution might never have taken place. That's because Ambarella provides the chips and intellectual property to process and compress high definition video so that hours of it can be recorded on inexpensive little SD and microSD storage cards.
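
    Some rough numbers show just how critical that compression is. Uncompressed 1080p video at 30 frames per second amounts to roughly a gigabit and a half per second; H.264 compression of the kind Ambarella's chips perform brings that down to the low tens of megabits. The quick calculation below (Python; the 15 Mbps figure is a typical bitrate for this class of camera, not an official spec) shows why hours of HD video suddenly fit on a little SD card:

        # Uncompressed 1080p30: 1920 x 1080 pixels, 24 bits per pixel, 30 fps
        raw_bps = 1920 * 1080 * 24 * 30
        compressed_bps = 15e6   # assumed typical H.264 bitrate for this class of camera

        print(f"Uncompressed: {raw_bps / 1e6:.0f} Mbps")          # ~1493 Mbps
        print(f"Compressed:   {compressed_bps / 1e6:.0f} Mbps")   # 15 Mbps
        print(f"Ratio:        {raw_bps / compressed_bps:.0f}:1")  # ~100:1

        # Hours of video that fit on a 32GB card at 15 Mbps:
        card_bits = 32e9 * 8
        print(f"32GB card:    {card_bits / compressed_bps / 3600:.1f} hours")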

    So with a market pining for affordable yet high-performance action cameras and Nick Woodman having the foresight to pack it all into his mighty little video cubes, the GoPro phenomenon was born. And that despite GoPro not even having been first: RuggedPCReview editors reviewed the Liquid Image VideoMask and the Contour HD 1080p camcorder before we, not being surfers, even knew of the early GoPros.

    Once we did get hold of a GoPro Hero, we published a detailed review entitled "GoPro Hero — the GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do" where we analyzed both the GoPro hardware and its performance in great detail. We were quite impressed with the "terrific HD video" and "stunning value" of the GoPro, but took major issue with GoPro's underwater housing, which GoPro claimed was good to depths of 180 feet. The housing probably was, but its curved dome lens kept the camera from focusing underwater. Baffled, we wrote in caps "YOU CANNOT USE THE CAMERA UNDERWATER BECAUSE IT WON'T FOCUS?!!?" Baffled was too mild a word, as we had used GoPros to record majestic giant mantas hundreds of miles off the coast of Mexico on a once-in-a-lifetime trip, and the GoPro video was all blurry.

    We reported in even greater detail on the GoPro Hero2, which still couldn't focus underwater. Apparently, the GoPro people were great surfers, but definitely not divers. The GoPro documentation now said "Please note that due to the curved lens of the HD HERO2 (and the original HD HERO) waterproof housing you will notice a slight loss of sharpness with underwater images." To which we responded in our report: "That is not good enough. It is NOT a slight loss. It makes the camera unusable underwater."

    Our reviews, using third-party underwater housings that DID allow sharp focus, attracted GoPro's attention, and we subsequently helped them test an underwater housing that worked, as well as underwater filters that addressed the early GoPros' inability to adjust white balance to underwater conditions. Below is a brief video of the fun I had with a GoPro Hero2 and a pride of young sea lions off Coronado island in Mexico.

    The GoPro Hero3 premiered in October 2012 with an even better and less bulky underwater housing. The new line, whose body was even more compact than that of the Hero 1 and 2 and used tiny microSD cards, included a basic "White" model that performed at the Hero 1 level, a "Silver" Edition that was like an updated Hero2, and a top-of-the-line "Black" Edition with a powerful Ambarella A7 chip that allowed recording 1080p video at 60 frames per second and, for the first time pretty much anywhere, 4K video, albeit only at 15 frames per second. The "Black" Edition could also shoot stills and video simultaneously, and it had selectable white balance settings. We tested these new cameras both above and under water and also shot many hours of underwater 3D video with GoPro's ingenious system of linking two cameras together.

    About a year later came the Hero3+, a slightly smaller, fine-tuned version of the Hero3 that efficiently addressed a number of Hero3 omissions and shortcomings, such as better audio, close-ups, better low-light performance, etc.

    And now GoPro has announced the Hero4, which includes the top-of-the-line Hero4 Black (US$499) and the Hero4 Silver (US$399), as well as a new basic camera, just called Hero (US$129). The most important new feature is that the Hero4 Black can do 4k video at a full 30 frames per second (and also 2.7k/50fps and 1080p/120). This means the camera is now capable of shooting usable 4k video and also high-quality slow-motion HD video. That could be a major boon to buyers of 4k HDTVs who find out that there really isn't any 4k content. The Hero4 Silver has roughly the same capabilities as the old Hero3 Black but includes, a first for GoPros, an integrated LCD — prior GoPros could accommodate an optional snap-on LCD, but never came with an integrated one. The new US$129 entry-level Hero can do 1080p/30fps and 720p/60fps and is built directly into a waterproof housing.

    What does that all mean? Overall, with the Hero4, GoPro addresses the criticism that its Hero3/3+ could only do 4k video at a technology-demonstration level, as 15 frames per second is essentially useless. The new "Black" can do real 4k video, and the high-res slow-motion modes are very useful. The Hero4 will likely not only help GoPro remain wildly successful, but also cement its reputation for being at the bleeding edge of technology. The new "Silver" camera takes roughly the place of the older 3/3+ Black, and with the new basic US$129 Hero, the company has an inexpensive, nearly indestructible and totally waterproof camera.

    How does it all work? We don't know yet. We've had ample experience with numerous Hero 1, 2 and 3 models, but haven't been able to get any response from GoPro as of late. GoPro is now a public company whose stock premiered at US$24 a share on June 25, and had reached US$90 a share just three months later, on the morning of the Hero4 release.

    What has made GoPro such a success? Like all phenomena it's a number of things that all came together at the right time. GoPros aren't pretty, they are difficult to use, and their battery life is marginal, but they get the job done and then some. No one else thought of offering all the numerous (and inexpensive) mounting hardware that allows GoPros to go along for the ride virtually anywhere. No one else pushed recording of extreme sports as much as GoPro, all using their signature ultra-wide angle. No other camera fueled the imagination of more videographers, who used GoPros everywhere to show new angles and record what could not be recorded before. No one saw the potential of Ambarella's ground-breaking compression technology and inexpensive tiny storage cards as clearly as GoPro. And so here we are.

    What does it mean for users of rugged computing gear? It means that stunningly powerful high-speed, high-definition recording capabilities are available inexpensively and in a form uniquely suited for field operations. In the cases that come standard with every GoPro, the little cameras are virtually indestructible, and they can be used and mounted anywhere. Their recording performance is far above that of any camera integrated into any rugged handheld or tablet, and also above that of virtually all smartphones (and without the fear of breaking an expensive smartphone).

    I think it's fair to say that virtually any job out there that benefits from using a rugged notebook, tablet or handheld would also benefit from bringing along a GoPro, or a few of them.

    See GoPro's Hero4 press release

    Posted by conradb212 at 06:14 PM | Comments (0)

    September 16, 2014

    The unpredictable nature of screen sizes

    It's a mad, mad, mad world as far as the screen size of mobile devices goes. Witness...

    For smartphones, 4.7 inches or so now seems the least customers will accept, and 5.5 inches or larger is better. When Apple introduced its iPhone 6 (4.7 inch) and iPhone 6+ (5.5 inch), the demand was such that both Apple's and AT&T's websites couldn't keep up. The AT&T website, in particular, was so messed up from all the pre-orders of giant phones that almost a week after I tried to place my own order, I still have no clue whether the order went through or not.

    Dial back a couple of decades to the dawn of handhelds. The first Apple Newtons and Tandy/Casio Zoomers and such all had displays in the 5-inch range, and all were considered much too big and heavy. Which, of course, they were. So the Palms and Pocket PCs that followed weighed less and had much smaller screens. The standard screen size for an early Windows CE-powered Pocket PC, for example, was 3.8 inches, and that was quickly replaced by the 3.5-inch format. Phones weren't smart at that time, and when they did start getting some smarts, screens got progressively smaller as the quest was for smaller and smaller phones.

    This drive for the tiniest phones possible had an impact on industrial handhelds, with many switching to 3.5, 3.2 and even 2.8-inch screens, much too small for most serious work.

    What happened when the iPhone and then Android phones came along was that phones stopped being just phones and became computers with very small screens. The screen size issue was first addressed with "apps," software specifically designed for tiny screens, and then, increasingly, with larger screens as more and more customers wanted to do "real" computer work on phones. The lure and tangible benefits of larger screens outweighed the inconvenience of having larger and larger phones, and so now we have phones with screens measuring almost six inches diagonally. Obviously, it's not a trend that can go on.

    Interestingly, tablets and laptops followed different trajectories.

    Laptops initially had very small screens, because the industry didn't know yet how to make larger LCDs. Once that technological hurdle was solved, screens became ever larger, with laptop screens growing to 17 inches. That meant size and bulk and weight and cost. Many attempts at smallish, lightweight "boutique" laptops failed due to cost, until netbooks arrived. Despite their tiny screens they sold in the tens of millions, primarily based on their very low cost. The low cost, unfortunately, also meant low performance, and so customers demanded more speed and larger screens. The industry complied, but once netbooks were large and powerful enough for real work, they cost and weighed more and customers abruptly stopped buying and abandoned the netbook market in favor of tablets or, more specifically, the iPad.

    Interestingly, despite the decreasing cost of large screens, laptops all of a sudden became smaller again. Apple dropped its 17-inch models and is devoting all its energy to super-light 11- to 13-inch models. And it's rare to see a ruggedized laptop with a screen larger than 15 inches, with most being in the 12- to 14-inch range.

    With tablets, customers don't seem to know what they want. Historically, the first attempts at tablets and pen computers back in the early 90s all had 8- to 10-inch screens, primarily because no larger displays were available. When Microsoft reinvented the tablet with its Tablet PC initiative in 2001/2002, almost all available products were 10 inches, with the 12-inch Toshiba Portege 3500 being the sole deviation, making it look huge. None of the Tablet PC era tablets and convertibles were successful in the consumer market, though, and that lack of interest didn't change until the iPad arrived.

    The iPad set the standard for a "full-size" tablet with its 9.7-inch display that seemed neither too large nor too small, though many didn't know what to make of Apple sticking with the traditional 4:3 aspect ratio at a time when every display was going "wide." The aspect ratio issue hasn't been resolved as of yet, and screen sizes remain an issue. Initially unable to take any marketshare from Apple, competitors tried another way by going smaller and cheaper, and suddenly everyone predicted 7-inch tablets were the sweet spot. And the 7-inch class was successful enough to get Apple to issue the iPad mini which, however, never sold nearly as well as the larger original.

    With tablets getting smaller and smartphones larger, the industry came up with the awkward "phablet" moniker, with "phablets" being devices larger than phones but smaller than conventional tablets. But now that phone displays are 5.5 inches and larger, "phablets" have become "phones," and the survival of 7-inch tablets seems in doubt, unless people suddenly decide that a 7-inch "phone" is desirable, in which case the old 1992 EO440 would look prescient rather than absurd.

    As is, no one knows what'll come next in terms of screen size. Even years after the first iPad, tablets with screens larger than 10 inches have not caught on. Phones as we know them simply cannot get much larger or else they won't fit into any pocket. And the size of traditional notebooks will always be linked to the size of a traditional keyboard, the operating system used, battery life, and the cost of it all.

    It should be interesting to see how things develop. And that's not even getting into display resolutions, where some phones now exceed the pixel count of giant HD TVs.

    Posted by conradb212 at 04:57 PM | Comments (0)

    August 22, 2014

    Why OneNote for Android with handwriting is important

    A few days ago, the Office 365 and OneNote blogs at Microsoft announced either OneNote for Android or the addition of handwriting and inking support to OneNote for Android; it wasn't quite clear (see here). While Microsoft OneNote isn't nearly as popular as Word and Excel, it's available as part of Microsoft Office, and supposedly over a billion people use Office. So there may be tens or even hundreds of millions who use OneNote.

    What is OneNote? It's sort of a free form doodle pad that can accommodate all sorts of data, from screen clippings to text, to audio, annotations, revisions, doodles, and so on. It can then all be sorted and shared with others. If that sounds a lot like the older Microsoft Journal, there are plenty of similarities.

    Journal goes way, way back to the mid-1990s, when a company named aha! introduced it as the aha! InkWriter. The idea behind InkWriter was that you could arrange and edit handwritten text just like you can manipulate and edit text in a word processor; hence InkWriter's billing as an "ink processor." Handwriting and handwriting recognition were big back then, the central concept around which the initial pen computers of the early 1990s were built. As a result, InkWriter could also convert handwriting into text for later editing and polishing. To see how InkWriter worked, see the full manual here and Pen Computing Magazine's 1995 review here.

    Anyway, Microsoft thought enough of it to buy InkWriter and make it available in Windows CE and on Pocket PCs for many years. Renamed Journal, it's still available on most Windows PCs. In many ways, OneNote looks like an enterprise and business version of Journal, one that's more practical and more powerful. To this day, some users prefer Journal while others prefer OneNote.

    The question may come up why Journal didn't become a major hit once tablets really caught on. Aha! designed InkWriter specifically for pen computers, i.e. tablets. There was, however, one major difference between the tablets of the 1990s and the modern media tablet: all 90s-era pen computers had an active digitizer pen well suited for handwriting and drawing. Though no longer centered around handwriting recognition, Microsoft's Tablet PC initiative of 2001/2002 continued the reliance on an active pen, enabling super-smooth inking and beautiful calligraphy.

    That still wasn't enough to make the Tablet PC a success, and it took several more years until the iPad showed that effortless tapping and panning and pinching on a capacitive multi-touch screen was the way to go. The tablet floodgates opened, and the rest is history. Unfortunately, with all the success of modern-era touch screen tablets, one simply cannot annotate a document with fingers, one can't write with fingers, and there's a good reason why fingerpainting is for kindergartners and not for serious artists.

    Samsung realized this when it equipped its rather successful Galaxy Note phablets with miniature Wacom-style pens. But since the Wacom active digitizer adds size and cost, it isn't common in modern tablets, and even Windows-based tablets usually resort to those broad-tipped and rather imprecise capacitive pens that are, for the most part, nearly useless.

    But all that may be about to change. The latest advances in capacitive touch enable passive pens with tips as narrow as that of a pencil (see Advances in capacitive touch and passive capacitive pens). That will be a boon to Windows tablets that still have all those tiny check boxes, scrollers, and other interface elements designed decades ago for use with a mouse. And it will be a boon to apps like OneNote (and Journal!).

    And with Android now dominating the tablet market as well, the availability of OneNote for Android with full handwriting/inking capabilities may mean that OneNote will find a much larger user base than it's ever had. That's because all of a sudden, the combination of OneNote and advanced passive capacitive pens will add a whole new dimension to tablets, making them infinitely more suitable for content creation rather than just consumption.

    Posted by conradb212 at 10:13 PM | Comments (0)

    July 31, 2014

    Android on the desktop!

    Though Android dominates the smartphone market and has a very strong position in tablets, until now Google's OS platform was not available for desktops and conventional notebooks (the Chromebook with its limited offline functionality doesn't really count).

    That has now changed with the new HP SlateBook, a full-function, quad-core Tegra 4-powered notebook with a 14-inch 1920 x 1080 pixel 10-point multi-touch screen, running Android 4.3. The SlateBook weighs 3.7 pounds, offers up to 9 hours of battery life from its 32 watt-hour battery, has USB 3.0 and HDMI ports, 16GB of eMMC storage, a full-size keyboard, and starts at US$429.

    The HP SlateBook is not a rugged product, but its arrival will almost certainly have an impact on the rugged computing industry. In fact, I believe that Android availability on desktops and laptops may change everything.

    Think about it: While Microsoft continues to dominate the desktop and laptop, Redmond's presence in tablets is mostly limited to industrial and enterprise products. And while Windows CE and Windows Mobile/Embedded Handheld have managed to hang on in industrial handhelds for years longer than anyone expected, the handwriting is on the wall there, too. Between not having any upgrade or migration path from WinCE/WinMobile to Windows Phone, and Windows Phone itself being just a distant also-ran in smartphones, we'd be very surprised to see any sort of dominant Microsoft presence in industrial handhelds in the future.

    One of the primary reasons why WinCE/WinMobile managed to hang around for so long was the leverage argument: enterprise runs Microsoft and enterprise IT knows Microsoft, so using Microsoft on handhelds means easier integration and less development and training costs. But with all of those hundreds of millions of iOS and Android tablets and smartphones, even Microsoft-based enterprise quickly learned to work with them and integrate them, so the leverage argument isn't what it used to be.

    On the desktop side, it will forever be unclear to me what drove Microsoft to craft Windows 8 as a dual-personality system with the largely useless (for now anyway) Metro interface and a crippled legacy desktop that made no one happy. The 8.1 upgrade fixed a few of the worst ideas, but even as a face-saving retreat it was too little to address the basic structural problems. And whatever surfaces as Windows 9 will still have the unenviable task of seeking to perpetuate Microsoft's dominance of the desktop, where most people still work, while watching the world go ever more mobile on non-Microsoft platforms.

    I think that by far Microsoft's strongest argument is that a mouse-operated windowed multi-tasking environment on a large display remains superior for creative and office work to finger-tapping on a tablet. I think the Office argument isn't nearly as strong. Sure, Office has a virtual monopoly in the classic office apps, but perhaps 90% of its functionality is available from numerous other sources. Even as a journalist, writer and publisher, I rarely use more than the barest minimum of Office's myriad functions.

    And though at RuggedPCReview.com we're OS-agnostic, by far our platform of choice is the Mac. The painless OS upgrades alone would be enough to pick Apple, but what really clinches the deal is that when you get a brand-new Mac, you simply hook the Time Machine backup from your old one up to it and transfer everything to the new one, apps, data, settings and all. The next morning, whatever was on the old Mac is on the new one. No fuss at all. That is simply not possible with Microsoft.

    Anyway, I often thought how nice it'd be to load Android on some of the numerous old Windows machines in the office that we can no longer use because the Microsoft OS on them is no longer supported or -- worse -- the system decided it was now pirated because we put in a new disk or processor. I thought how nice it'd be to have Android on the desktop or a good notebook for those times when you simply do need the pin-point precision of a mouse, the solidity of a real desktop keyboard, the comfort of a real desk and office chair, or the ability to see everything on a big 27-inch screen.

    Or how nice it'd be to have exactly the same OS on such a big, comfortable, productive machine as well as on a handy tablet or a smartphone. I'd use that in a heartbeat.

    There are, of course, questions. Microsoft itself has wrestled for decades with ways to provide a unified Windows experience on various platforms, and almost all those attempts failed. Apple didn't fall into the one-OS-for-all-devices trap and instead chose to optimize the user interface to the platform's physical characteristics. But that came at the cost of a rather weak connection between the Mac OS and iOS (why Apple doesn't get more criticism for iTunes and iCloud has always been beyond me). And even if Android were to emerge, full-blown, on laptops and desktops, we'd soon miss having multiple windows open. I mean, flipping between full-screen apps might have been acceptable back in the late 1980s, but going back to that would be a giant leap into the past.

    Still, if full Android were available for desktops today, I'd drop everything I was doing to install it on several of the systems in our office right this instant. And I am pretty certain that full Android availability on desktops and laptops would mean a massive renaissance for those currently beleaguered platforms.

    So the release of HP's Android SlateBook just may be a milestone event.

    Posted by conradb212 at 08:00 PM | Comments (0)

    July 25, 2014

    Advances in capacitive touch and passive capacitive pens

    RuggedPCReview.com just finished reviewing the Panasonic Toughpad FZ-M1, and we ended up calling it a "milestone product" because of its novel and unique passive capacitive stylus that can greatly increase productivity because it allows for far greater precision when using the Windows desktop than touch alone or earlier capacitive pens.

    This article describes the past and current situation of touch in rugged tablet computers, and why the technology employed by Panasonic in the FZ-M1 is so relevant.

    Until the iPhone and then the iPad popularized capacitive touch, tablets used either active or passive digitizers. Active digitizers used a special pen, with Wacom's electromagnetic system by far the most popular because its slender pen did not need a battery. Passive digitizers were mostly of the resistive variety, and worked with any stylus, piece of plastic or even a fingernail. Neither of these older digitizer technologies helped tablets make much of an impact beyond limited industrial and special market applications.

    Active pens allowed for fairly precise operation, smooth inking, and since the cursor followed the tip of the pen even without the pen touching the display, users always knew where exactly the cursor was, just like with a mouse. But tablets relying on active pens became useless if the pen was misplaced or lost, necessitating the addition of a touch digitizer, which added cost and the need for "palm rejection," i.e. keeping the system from mistaking the pressure of one's palm when writing with the pen as input.

    Passive resistive digitizers are still widely used, but they are a poor match for any user interface designed for operation with a mouse. That's because with them, the touch of a finger or the tip of a stylus is the equivalent of a left mouse click whether or not the point of contact was where the user intended. Resistive touch works well for specially designed applications with large buttons, but relying on it for precise operation, like using a word processor or spreadsheet, can be quite frustrating.

    Apple didn't invent capacitive touch, but the iPhone certainly made a compelling case for effortlessly navigating an operating platform specially designed for the technology, and the iPad really drove the point home a few years later. Tapping, panning, pinching and zooming quickly becomes second nature. To the extent where NOT being able to pan and zoom on a display has become almost a deal-breaker.

    While capacitive multi-touch certainly redefined the way tablets are used, and is largely responsible for the massive success of the tablet form factor, the technology isn't perfect, or at least not yet. It works very well in operating environments designed for it, such as iOS and Android, and with any app where tapping, panning, pinching and zooming gets the job done. But capacitive touch works far less well with software not originally designed for it, software that relies on the precision, small-fractions-of-an-inch movements of the mouse interface. That's why it remains clumsy to use word processors or spreadsheets or graphics apps with capacitive touch -- the precision required to operate them just isn't a good match for something as blunt as the tip of a finger. It's conceivable that a generation growing up with touch, kids who may have never used a mouse, won't see this as a shortcoming, but simply as the way touch works, and there'll undoubtedly be software that'll get the job done.

    As is, we have a situation where hundreds of millions still rely on software that was NOT designed for touch (Windows), and so the need to continue to use that software conflicts with the desire to use state-of-the-art tablet hardware with capacitive multi-touch. So why not use a combination of capacitive multi-touch and an active pen, combining the best of both worlds? That can be done, and is being done. Samsung has sold tens of millions of Galaxy Note "phablets" that combine capacitive multi-touch with a Wacom active digitizer pen (making it by far the most successful pen computer ever sold). Is it a perfect solution? Not totally. The Wacom digitizer adds cost, what you see on the screen tends to lag behind when the pen moves quickly, and things get imprecise around the perimeter. And if the pen gets lost, the pen functionality of the device goes with it.

    Whatever the technical issues may be, we've now reached a point where customers pretty much expect capacitive multi-touch even in industrial and vertical market tablets. The tap / pan / pinch / zoom functionality of consumer tablets has just become too pervasive to ignore. So we've been seeing more and more rugged and semi-rugged tablets (as well as handhelds) using capacitive touch. That's no problem with Android-based tablets, since Android was designed for capacitive touch. But unlike in the consumer market, where iOS and Android dominate, enterprise customers continue to demand Windows on their tablets. Which means a precise pen or stylus is pretty much mandatory.

    Now what about capacitive pens? They have been around since the early days of the iPad, using a broad rubber tip on a stylus to provide operation a bit more precise than is possible with a finger. How much more precise? That depends. Even slender index finger tips measure more than 10 mm, whereas those capacitive styli have tips in the 7-8 mm range. That seemed improvement enough for several manufacturers of rugged tablets to include capacitive styli with their products. The tips of current styli are narrower than those of earlier versions, but still in the 5 mm range, and they still have soft, yielding tips. They work a bit better than older ones, but in no way as well as a mouse or an active pen. Not much can be done about that, or can it?

    It can.

    When Panasonic sent us a Toughpad FZ-M1 rugged 7-inch Windows tablet for review it came with a full-size plastic pen. I assumed it was a passive stylus and that the little Panasonic tablet had a resistive touch interface in addition to its capacitive multi-touch. But Panasonic clearly described the stylus as "capacitive." In use, the pen with its hard 1.5mm tip was quite precise, far more so than any capacitive stylus I've ever used. It only required a very light touch to operate the classic Windows user interface. Panning was easily possible, and unlike resistive styli that really can't be used for fine artwork or handwriting (they are too jerky), this stylus produced smooth lines. It also never fell behind, one of my pet peeves with Wacom pens. The picture above shows the Panasonic pen (left) sitting next to a couple of older capacitive styli.

    So what was it? I contacted Geoff Walker, RuggedPCReview's former technology editor and now Senior Touch Technologist at Intel, and asked his opinion. Geoff got back to me with a detailed email and said that the Toughpad FZ-M1 apparently was an example of the very latest in projected-capacitive touch controller capability. He explained that over the past couple of years the signal-to-noise ratio of touch controllers has been increased to a point where they can now recognize conductive objects no more than a millimeter thick. Whereas the controllers used in the original iPad may have signal-to-noise ratios of 10:1 or 20:1, the latest controllers can handle 3000:1. He suggested we try other objects on the Toughpad, and I found that a standard plastic stylus doesn't work, but a regular pencil does, as does a metallic stylus or even a metal paper clip. There's no pressure-sensitivity, which is something Wacom pens can do, but that's not really necessary for anything but art.

    But there's more. The Toughpad FZ-M1 also comes with a touch screen mode setting utility.

    Default mode is Pen / Touch, where both fingers and the pen are recognized.

    There's pen-only operation, like when drawing or writing, or whenever you don't want your fingers to inadvertently trigger an action. Geoff said that probably works by measuring the area of touch and ignoring signals above a certain size.

    Next is a glove touch setting. Geoff said this could be done by increasing the sensitivity of the touch controller so that it can recognize a finger even a brief distance away from the screen, as in the distance that the material of a glove adds to the finger's distance from the screen. That appears to be the case as the Toughpad FZ-M1 does work while wearing thin gloves. I thought it was the conductivity of the glove material, but apparently it's the thickness of the glove that matters.

    Then there is a Touch (Water) setting. Capacitive touch and water generally don't get along because, as Geoff explained, water is so conductive that it affects the capacitance between two electrodes, the concept upon which projected capacitive touch is built. What can be done, Geoff said, is switch from the standard mutual capacitance mode to self-capacitance where the capacitance between one electrode and the ground is measured. The capacitive pen doesn't work in this mode because a fairly large touch area is required. I tried this mode with water from a faucet running onto the display. The display still recognized touch and to a very limited extent two-finger operation, but this mode is mostly limited to tapping.
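
    The difference between the two measuring schemes also explains why multi-touch suffers in this mode. Mutual capacitance measures every row/column intersection individually, so two touches yield two unambiguous points. Self-capacitance only measures whole rows and whole columns, so two touches light up two rows and two columns, and the controller sees four candidate intersections, two real and two "ghosts." Here's a deliberately simplified toy model (Python; an illustration of the principle, not of how any particular controller actually works):

        # Two real touches on a 5x5 sensor grid
        touches = [(1, 3), (4, 0)]   # (row, col)

        # Mutual capacitance: each row/column intersection is measured
        # individually, so the touch list itself is the result -- unambiguous.
        print("mutual:", touches)

        # Self capacitance: only whole-row and whole-column signals exist,
        # so every row/column combination is a candidate touch point.
        rows = sorted({r for r, c in touches})
        cols = sorted({c for r, c in touches})
        candidates = [(r, c) for r in rows for c in cols]
        print("self:  ", candidates)   # four candidates, two of them ghosts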

    What does all this mean? For one thing, it means that thick-tipped capacitive styli will soon be a thing of the past, replaced by far more precise passive capacitive pens with tips in the one-millimeter range, like the one available with the Panasonic Toughpad FZ-M1. Second, and I don't know if that's technically possible in Windows or not, since the touch controller can sense the tip even in close proximity to the screen, "hovering" or "cursor tracking," where a cursor follows the pen even without the pen touching the screen (and thus without triggering a left mouse click), may also be implemented. That's one useful thing active styli do that even advanced passive capacitive pens can't do (yet).

    Would all of this still matter if it were not for the need to better support the legacy Windows desktop and its mouse-centric applications? It would. Tablets, hugely popular though they are, are mostly used for content consumption. The emergence of precise pens will open tablets to a much broader range of content creation. -- Conrad H. Blickenstorfer, July 2014

    Posted by conradb212 at 04:14 PM | Comments (0)

    June 30, 2014

    Reporting from the road -- sort of

    As editors of RuggedPCReview.com, we're probably better equipped than most to report anytime, anywhere, and under any conditions. After all, we not only have access to the latest and greatest mobile computing and communications gear, but much of that gear is designed to go just about anywhere.

    The reality is a bit different, as we learned the hard way on a stretch of several weeks on the road and high sea. Testing underwater still and video camera equipment around some of the West Indies islands in the Caribbean, we wished for two things. One was that cell service were truly global, without all the hassles of having to switch SIM cards, sign up for international service, and fear the ever-present phone company gotchas that can result in huge charges on the next monthly bill. Another was that the GPS apps on our various devices were not so completely dependent on cell coverage to work. They all have many gigs of storage, so why not include a reliable and reasonably precise base map that always works, no matter where you are on the globe?

    So while on the good ship Caribbean Explorer II, we were essentially cut off from the world. There are WiFi hotspots in ports, of course, but most are locked and the signal of those that are not was usually insufficient for anything other than frustration and perhaps a Facebook update or two.

    A subsequent extended road trip to the great state of Tennessee promised better coverage. Between a MacBook Pro, two iPads, and two iPhones, keeping in touch and getting work done seemed doable. To some extent anyway. It's amazing what can be done work-wise on a tablet these days but, truth be told, it can take a lot longer to get even simple things done. That's where the MacBook was supposed to come in. The big 15-inch Apple laptop doesn't have WWAN, though, and so we signed up for AT&T's 5GB US$50 hotspot service on one of the iPads.

    Why pay for the iPad hotspot when there's WiFi coverage virtually everywhere these days? Because that WiFi coverage often isn't there when you need it most, or it doesn't work right, or sitting in a busy Starbucks just isn't the best place to get work done. The hotspot worked fine until, after just three days on the road, a message came up on the iPad saying we had used up the allotted 5GB and it was time to pay for more. What? Apparently three days of moderate browsing and some work were enough to go through a full 5GB. I have no idea how that bit of activity could use so much data, but it did. That bodes ill for a future where increasingly everything is metered. Data usage is never going to go down, and metered data looks to be an incredible goldmine for the telcos and a massive pain for users.
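
    In hindsight, a back-of-the-envelope calculation makes those 5GB less mysterious than they felt at the time. The numbers below (Python; the per-category figures are rough assumptions, not measurements) show how quickly ordinary use adds up once updates and sync traffic ride along:

        allowance_gb = 5
        days = 3
        print(f"Budget: {allowance_gb * 1024 / days:.0f} MB/day")   # ~1707 MB/day

        # Rough assumptions for one working day on a laptop hotspot:
        browsing   = 300    # MB: media-heavy pages run several MB each
        email_sync = 100    # MB: attachments and cloud sync
        updates    = 1200   # MB: a single OS or app update can eat a gigabyte
        print(f"Plausible daily use: {browsing + email_sync + updates} MB")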

    In the end, what it all boiled down to was that, yes, we're living in a connected world, but no, you're not really connected all the time and you can't ever rely on having service. And, surprisingly, digital data coverage remains a frustratingly analog experience. There's coverage, maybe, somehow, but it's marginal and you may or may not get a connection, and if you do it may not be good or persistent enough to get actual work done. From that standpoint, it's a mobile world, but one that requires you to be stationary to actually make it work.

    I often tell people that I can work anywhere, anytime, as long as I have internet access. But that's only true to some extent. Tablets and smartphones can only do so much. Even a full-function laptop is not likely to include every utility, file and tool that's on the desktop in the office. "The Cloud" truly cannot be relied on. And neither can data service.

    All of that will change someday, and hopefully someday soon. As is, the connected world is a work in progress.

    Posted by conradb212 at 03:49 PM | Comments (0)

    April 23, 2014

    Unpacking the Xplore iX104 XC6

    So we get this box from Xplore Technologies, and it's pretty heavy. And it's a bit grimy. We figured we'd better open it outside. This is what happened:

    Yes, Xplore sent us the brand-spanking-new iX104 XC6 to make a point. Sod: it can handle grime and dirt. Sunglasses: you can use it in bright sunshine. Measuring tape: you can drop it from seven feet. Ice cube tray: it's freeze-proof. Inflatable pool ring: it can handle full immersion.

    It also has a Haswell processor under the hood. And dual 128GB solid state disks in a RAID 0 arrangement. So equipped and still wet from the hose-down, the big, tough Xplore blasted to the fastest PassMark benchmark we ever recorded. Impressive.

    Posted by conradb212 at 12:48 AM | Comments (0)

    April 07, 2014

    Durabook R8300 -- ghosts of GoBooks past

    There are things in life where the outrage and pain just never seem to go away. For me that includes the infamous Game 6 in the 2002 basketball playoffs where the NBA stole the championship from my Sacramento Kings, the forced demise of the Newton, a relationship issue or two, and then there is the way General Dynamics took over that intrepid small-town Itronix computer company up in Spokane, Washington, just to then ruin it and shut it down.

    There. Hundreds of people lost their jobs in Spokane when the corporate bigwigs at General Dynamics moved operations into some unfilled space in one of their buildings in Florida and then shuttered the Spokane facility where some of the most dedicated people in the industry had built rugged computers since the early 1990s.

    But wait... not absolutely everything is lost. You see, like most US computer companies, Itronix had a close working relationship with a Taiwanese OEM/ODM, in this case Twinhead. While the precise details of the Itronix/Twinhead relationship are known only to insiders, it's safe to assume that there was close interaction between the Itronix designers and engineers up in Spokane and their Twinhead counterparts in Taipei. This was not a case of a US company just slapping its label on a machine designed and made in Taiwan. It was a close cooperation, and most of the machines sold by Itronix were exclusives, meaning that no one else sold them under their own brand and label.

    An example was the Itronix flagship GoBook III, which was replaced by the GoBook XRW, and then, once General Dynamics had inexplicably discarded the hard-earned brand equity in the "GoBook" name after they took over, the GD8000 (shown in picture) and its tech refresh, the GD8200. That machine and Getac's fully rugged notebooks were the Panasonic Toughbooks' primary competition in the rugged notebook market. Precise sales figures are hard to come by in this industry, but by most accounts Itronix had about a 12% marketshare.

    It's been over a year since Itronix was shuttered, but what should suddenly arrive but the GammaTech Durabook R8300. It immediately seemed familiar to me, and a closer look revealed that, yes, it's an updated version of the GD8200. The name alone gives a clue, as GammaTech usually names its devices with a combination of letters and numbers centering around display size, like CA10 or S15H. Itronix named their machines by series, and so it was the GD4000, GD6000, and GD8000. The GD8200 refresh may have signified its move to 2nd generation Intel Core processors, in which case the R8300 name could be a combination of R for rugged, and 8300 paying homage both to the machine's origin and to the switch to 3rd generation Ivy Bridge processors.

    Be that as it may, its origin and history instantly qualify the Durabook R8300 as a serious contender in the rugged notebook market. Yes, a 4th gen Intel chip would have been nice, but keeping up with Intel's ever-changing generations isn't the highest priority in a class of machines where longevity and backward compatibility mean more than the very latest specs. As is, the R8300, having the same design and layout as the GD8000 and GD8200, will most likely work with all existing GD-Itronix docks and peripherals, and anyone seeking to replace aging 8000 Series Itronix notebooks should be thrilled.

    So at least some part of the longstanding, fruitful cooperation between Twinhead and Itronix lives on. The GD8200 was a terrific workhorse of a machine, and with the updated tech specs, the Durabook R8300 is certain to be even better.

    Posted by conradb212 at 06:41 PM | Comments (0)

    March 19, 2014

    Getac's latest rugged convertible replaces the V8 with a turbo-4

    I love cars and often use automotive analogies to describe situations. One came to mind recently as we evaluated a particularly interesting new rugged mobile computer. So here goes:

    With gas mileage becoming ever more important in cars and trucks, the automotive industry has been pulling out all the stops to come up with more fuel-efficient vehicles. One way to boost fuel efficiency is to reduce the weight of the vehicle. Another is to use smaller turbocharged motors that burn less fuel while providing the same performance.

    That came to mind when we did a full hands-on test with the new Getac V110 convertible notebook computer. It's very significantly lighter than its two predecessor models. And it uses a processor with less than half the thermal design power than those predecessor models. Yet, it offers close to the same performance and has even longer battery life.

    Here's how this likely came about. While Getac's early convertible notebooks used low-voltage, economical processors, subsequent models became more powerful and required larger batteries. That eventually pushed their weight into the six to seven pound range, quite a bit for devices meant to be carried around and occasionally used as tablets.

    So Getac's engineers went back to the drawing board and designed a new convertible notebook, using the latest space- and weight-saving technologies to cut an ounce here and a fraction of an inch there. Most importantly, they switched to low-voltage versions of Intel's 4th generation Core processors that include new and superior power-saving features. That allowed them to replace the massive battery of the older models with two small batteries, each no larger than an iPhone. The resulting design, the V110, weighs over two pounds less than the last V200 we tested, and a good pound and a half less than the old V100. The V110 is also much thinner.

    So as for the automotive analogy, Getac replaced a hulking V8-powered SUV with a much svelter, lighter one with a turbo-4.

    But does that work in the real world? For the most part it does. While peak processor performance of the V110 is close to that of the standard-voltage predecessor models, idle power draw is less than half. What that means is that in many routine applications, the re-engineered and much lighter V110 will get the job done on what amounts to half a tank compared to the predecessor models. There's just one area where the automotive analogy breaks down: whereas automotive turbos can suck a good amount of fuel under full load, the V110 remained quite economical even when pushed to the limits.
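
    The arithmetic behind that "half a tank" claim is simple enough: runtime is essentially battery capacity divided by average draw. The sketch below uses hypothetical round numbers for illustration, not Getac's actual specs:

        def runtime_hours(battery_wh, avg_draw_w):
            # Idealized runtime: battery capacity divided by average power draw
            return battery_wh / avg_draw_w

        # Hypothetical round numbers for illustration only: a standard-voltage
        # predecessor with a big battery vs. a V110-style design with half the
        # average draw and a much smaller, lighter battery.
        print(runtime_hours(battery_wh=90, avg_draw_w=12))   # old design: 7.5 hours
        print(runtime_hours(battery_wh=50, avg_draw_w=5))    # new design: 10.0 hours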

    Posted by conradb212 at 03:55 PM | Comments (0)

    February 25, 2014

    Samsung Galaxy S5 -- raising the bar

    On February 24, 2014, Samsung showed their new flagship smartphone, the Galaxy S5. It's relevant because it'll sell by the millions, and it'll be the primary competitor to Apple's current iPhone 5s and whatever Apple comes up with next. But the Galaxy S5 is also relevant because the technology it includes and provides will become the new norm. It'll be what people expect in a handheld. And that affects the rugged industrial and vertical markets, because what the Galaxy S5 (and soon many other consumer smartphones) offers is the benchmark, and people won't accept considerably less on the job.

    The Galaxy S5 is essentially a little tablet with phone functionality built in. It measures 5.6 x 2.9 x 0.32 inches and weighs five ounces. That's a lot larger than those tiny phones of a few years ago when the measure of phone sophistication and progress was how small a phone could be made. The S5 is really not pocketable anymore, but because it's so thin it still weighs less than half of what some of the original PDAs weighed.

    The Galaxy S5 screen measures 5.1 inches diagonally, and offers full 1920 x 1080 pixel resolution. That's the same as the current generation of giant flatscreen TVs, making the Galaxy's display razor-sharp, and essentially making a mockery of any claim to "high definition" that a current TV may have. And the screen uses OLED technology, which means no backlight and a perfect viewing angle.
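
    Pixel density follows directly from resolution and diagonal size, and a quick calculation shows just how lopsided the comparison with a TV is (Python; the TV figure assumes a 55-inch 1080p set):

        # Pixels per inch = diagonal pixel count / diagonal size in inches
        width, height = 1920, 1080
        diag_px = (width**2 + height**2) ** 0.5

        print(f"Galaxy S5 (5.1 in): {diag_px / 5.1:.0f} ppi")   # ~432 ppi
        print(f"55-in 1080p HDTV:   {diag_px / 55:.0f} ppi")    # ~40 ppi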

    There are, of course, two cameras. The front one has 2mp resolution, which in theory means that vidcam conversations can go on in full 1080p video. The other one has 16mp, not as absurdly extreme as the 41mp camera Nokia offers in its flagship, but much higher than Apple's, and the same you'd expect from a good dedicated camera. And it can even record 4k video. If that's at a full 30 frames per second, it's something even the vaunted GoPro Black Edition can't do, and it'll mean that the vast majority of hyper-sharp 4k video may come from smartphones, and not from some sort of disc player (no 4k standard exists yet), not from streaming video (most internet connections still can't handle that), and not from dedicated cameras (there are hardly any for now).

    Memory? No surprises there. But I should mention once again that the 16 or 32GB of storage built into smartphones such as the new Galaxy S5 more than likely contributed to the impending demise of lower-end dedicated digital cameras which, for some unfathomable reason, still only have 32MB (megabytes, not gigabytes) on board, one thousand times less than a smartphone.

    Communication? The fastest available WiFi (802.11ac) and the latest rev of Bluetooth (4.0), and also NFC, near field communication, sort of a version of RFID. Then there's a fingerprint scanner, and—also new—a pulse scanner that can feed heart rate into all sorts of apps and systems. I've long believed that sensors are a field of rapid advance and new capabilities, providing handhelds with ever more information about their environment, and thus making them ever more useful.

    And it's all powered by a 2.5GHz quad-core Snapdragon 800 processor that makes older processor technology look like cast iron flathead six designs from the 40s compared to a computer-controlled modern turbo with variable valve timing.

    Setting the bar even higher, this new Galaxy phone carries IP67 sealing, which means it's totally dust-proof, and also waterproof to the extent where it can survive full immersion into water down to about three feet. IP67 has long been the holy grail of rugged computer sealing, and now Samsung offers it in a consumer smartphone that's an eighth of an inch thick.

    I think it's fair to say that the bar has been raised. Dedicated industrial and professional tools will always do this thing or that better, have better dedicated components such as scanners, and they can, overall, be much tougher than any eighth-inch smartphone can ever be. But this level of technology, so freely available to millions, is simply certain to have an impact on expectations.

    Posted by conradb212 at 03:48 PM | Comments (0)

    February 18, 2014

    Android parlor trick

    Just a brief entry here....

    Up to Android "Jelly Bean," i.e. versions 4.1.x through 4.3.x, one of the cool things about Android was the (relative) ease with which one could do screen grabs. Those, of course, are essential to product reviewers. And so it was good to know that all one had to do was connect the Android device to a PC or Mac, fire up the Android SDK, click Applications > Development > USB debugging on, and grab those screens.

    That's what I wanted to do when I recently upgraded an Android tablet from "Ice Cream Sandwich" to "Jelly Bean." Unfortunately, Applications > Development > USB debugging was no longer there, and there seemed nothing else that would allow access to the debugging mode. Google to the rescue.

    Well, it turns out that getting into Android debugging mode now involves a secret handshake. You go to About Tablet, then tap on Build number SEVEN TIMES. That enables the Developer options menu, where you need to click on USB debugging. That's about as non-obvious as it gets, and probably reflects Google's efforts to keep all those hundreds of millions of Android users from hurting themselves by accidentally disabling their devices.
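
    For what it's worth, once USB debugging is enabled, screen grabs no longer require the full SDK machinery; the adb command-line tool alone can do it. A minimal sketch (Python wrapping adb; this assumes adb is on the PATH, the tablet is connected over USB, and the device has been authorized for the computer):

        import subprocess

        # Grab a screenshot over USB with adb: capture to the device's
        # storage, then pull the file back to the desktop.
        subprocess.run(["adb", "shell", "screencap", "-p", "/sdcard/screen.png"],
                       check=True)
        subprocess.run(["adb", "pull", "/sdcard/screen.png", "screen.png"],
                       check=True)
        print("saved screen.png")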

    That kind of hiding probably makes sense. I still believe one of the reasons why Linux never really made it big as an OS for the masses is that its creators insisted on leaving the arcane technical underbelly more or less visible to all. As Android matures, Google can't allow that to happen. Just like "View Page Source" has now vanished from easy view in all major browsers.

    But it's a good party trick. Next time you see a techie at a cocktail party or elsewhere, test him or her by innocently asking, "How do I get into debug mode under Jelly Bean??"

    Posted by conradb212 at 05:54 PM | Comments (0)

    January 12, 2014

    More 4k video contemplations

    All of a sudden, everyone is talking about 4k video. Also known as Ultra-HD video, four times the resolution of the 1920 x 1080 pixel 1080p standard, 4k was everywhere at the Consumer Electronics Show in Las Vegas. Now, obviously, 4k video isn't the most important thing on rugged mobile computer manufacturers' minds, but 4k video is nonetheless a sign of changing times. And with some consumer smartphones already offering full 1080p resolution on their small screens, and consumer tablets going well beyond that, it's only a matter of time until rugged and industrial market customers, too, will demand much higher resolution in their professional computing gear. So keeping track of what's happening out there with ultra-high resolution makes sense.

    As I stated in an earlier essay on high resolution (Thoughts about display resolutions, December 2013), I recently purchased a 39-inch Seiki 4k flatscreen display that can be used as a monitor or as a TV. It was an impulse buy, and I justified the (remarkably reasonable) price by deciding the Seiki was a research investment that would help us here at RuggedPCReview.com learn more about how 4k video worked, what was possible, and what wasn't.

    On the surface, 4k video makes an awful lot of sense. 1080p HD video was just perfect six or seven years ago for the emerging flood of 40-50 inch flatscreens. But flatscreens have grown since then, and 65, 70 and even 80 inches are now the norm. As you can imagine, the same old 1080p video doesn't look nearly as sharp on screens with two, three, or four times the real estate, or more. So doubling the resolution in both directions makes perfect sense.

    And it's a great opportunity to infuse new life and excitement into the flatscreen TV market. Three years ago, everyone offered 3D TVs. An amazing number were sold, amazing given that there was virtually no 3D content. And amazing considering one had to wear those annoying 3D glasses. So 3D quietly became just a built-in feature in most new TVs, but it's no longer a selling point. 4k video, on the other hand, IS now a selling point. And it'll become even bigger as time moves on.

    The problem, though, is the same as it was with HD, initially, and then with 3D: no content. There is no native 4k video standard for storage or players, and there are no 4k players and only a very few recorders, none of which is mature at this point.

    So we did some testing to see what's possible, and what's not possible. The goal was to see whether it's actually possible to get 4k video without breaking the bank. So here's what we did, and how it worked out so far.

    What can you do today with a 4k display such as our 39-inch Seiki? Well, you can watch regular HD TV on it via your satellite or cable setup. You can hook it up to video game consoles. You can connect it to streaming video gizmos like Apple TV, Google Chromecast, or the various Roku-type devices. Or you can connect a tablet, notebook or desktop to it. Sadly, almost none of these support resolutions higher than 1080p. Which means that on a 4k display you get video that may or may not look better than on a regular 1080p display. May or may not, because some devices do a decent job at "upscaling" the lower res. For the most part, though, displaying 1080p content on a 4k screen is a bit like running early iOS apps in "2X" mode, where each pixel was doubled in each direction. That's not impressive.
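
    To picture what that "2X" treatment amounts to: naive upscaling simply replicates each source pixel into a 2x2 block, which adds size but no detail. A toy illustration in Python (numpy standing in for the scaler):

        # upscale2x.py -- what naive pixel-doubling does (toy example)
        import numpy as np

        frame = np.arange(6).reshape(2, 3)   # stand-in for a 1080p frame
        doubled = np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)
        print(doubled.shape)   # (4, 6): twice the rows and columns, zero new detail

    Better scalers interpolate and sharpen rather than just replicate, which is presumably why some devices handle 1080p-on-4k more gracefully than others.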

    What about hooking up a notebook or desktop to the 4k screen? Well, none of the various computers around our offices supported more than 1080p. And the one Windows desktop I use most often for testing is actually a rather old HP with just a 2.2GHz Core 2 Duo processor. That's still good enough to run Windows 8.1 at a decent clip, but the video from the HP's Asus motherboard maxed out at 1680 x 1050 pixel. So it was time to look around for a video card that could actually drive 4k video.

    That, my friends, was a sobering experience for me as I realized how little I knew of current video card technology. Sure, we cover whatever Intel bakes into its latest generation of Core processors, and have a degree of familiarity with some of the discrete graphics subsystems available for various rugged notebooks. But beyond that there's an incredibly complex world of dedicated graphics chips, interface standards, different connectors, as well as an endless array of very specialized graphics features and standards they may or may not support.

    I am, I must admit, a bit of a gamer and love spending some relaxing hours playing video games on the Sony and Microsoft consoles. A particular favorite of mine is Skyrim, and so I bought a copy for the PC, to see what it would look like on the Seiki 4k screen. Well, initially I couldn't get the game to work at all on the old HP desktop as its motherboard video didn't support one feature or another. Now it was definitely time to invest in a graphics card.

    Several hours of Googling and reading up on things yielded only a vague idea of what might be a feasible solution to our video issue. You can, you see, easily pay more for a sophisticated video card than you pay for an entire computer. And some of those cards need two expansion slots, have large fans, and require massive power supplies to even run in a system. That was out. Was it even possible to get a decent video card that would actually drive 4k video AND work in an old PC like our HP?

    The next pitfall was that on Amazon and eBay you really never know if something is the latest technology, or old stuff from a few years ago. Vendors happily peddle old stuff at the full old list price and it's all too easy to get sucked in if you are not totally up to speed. So always check the date of the oldest review.

    What eventually worked best was checking some of the tech sites for recent new video chip intros. nVidia and AMD have practically the entire market, and are locked in fierce competition. The actual cards may come from other sources, but they will use nVidia or AMD chips. A bit more research showed that AMD had recently introduced Radeon R7 chips for reasonably priced graphics cards, and those cards actually appeared to support 4k video. And use the PCI Express x16 slot that my old desktop had. I truly did not know if that was the same connector and standard (almost every new Intel chip uses a different socket), but it looked the same, and so I ordered a Gigabyte Radeon R7 250 card with a gigabyte of GDDR5 memory on Amazon for the amazingly affordable price of US$89, with no-cost 2-day shipping with Amazon Prime.

    The card promptly arrived. And it fit into the x16 slot in the old HP. And the HP booted right up, recognized the card, installed the AMD drivers from the supplied DVD, and did nice 1680 x 1050 video via the card's DVI port on the vintage 22-inch HP flatscreen that had come with the PC. Skyrim now ran just fine.

    So it was time to see if the Radeon card would play ball with the Seiki 4k screen. Using a standard HDMI cable, I connected the old HP to the Seiki and, bingo, 4k video came right up. 3840 x 2160 pixel. Wow. It worked.

    Windows, of course, even Windows 8.1, isn't exactly a champion in adapting to different screen resolutions, and so it took some messing around with control panels and settings to get the fonts and icons looking reasonably good. And while I have been using a 27-inch iMac for years as my main workstation, 39 inches seems weirdly large. You'd watch a TV this size from a good distance, but for PC work, you're real close and it doesn't feel quite right.

    So now that we had an actual 4k video connection to a 4k display, it was time to look for some 4k content. YouTube has some (search "4k video demos"), and so we tried that. The problem there was that running it requires substantial bandwidth, and our solution—a LAN cable connected to a power line connector to run the signals through the building wiring, and then to our AT&T broadband—apparently wasn't up to snuff. So we saw some impressive demo video, very sharp, but more often than not it stopped or bogged down to very low frame rates. So bandwidth will be an issue.
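
    It's easy to see why bandwidth becomes the bottleneck. A back-of-the-envelope estimate (the codec compression ratios below are rough assumptions, not measured figures) puts even well-compressed 4k streams in the 15-30 Mbit/s range:

        # bandwidth.py -- rough 4k streaming bandwidth estimate
        width, height, fps = 3840, 2160, 30
        bits_per_pixel = 24                      # 8-bit RGB, uncompressed

        raw = width * height * bits_per_pixel * fps
        print("uncompressed: %.1f Gbit/s" % (raw / 1e9))            # ~6.0 Gbit/s

        # assumed compression ratios; actual rates vary widely with content
        for name, ratio in (("H.264-class, ~200:1", 200), ("HEVC-class, ~400:1", 400)):
            print("%s: %.0f Mbit/s" % (name, raw / ratio / 1e6))    # ~30 / ~15 Mbit/s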

    We also perused pictures in full resolution. 4k is over 8 megapixel in camera speak, and so you can view 8mp pictures in full resolution. The result, while quite stunning from a distance, is actually a little disappointing from up close. The JPEG compression that is usually hardly noticeable on smaller screens is obvious, and then there's the fact that even 4k resolution on a 39-inch screen isn't all that much. It's really just in the pixel density range of an older XGA (1024 x 768) 10-inch tablet, and those have long been left behind by "retina" and other much higher resolution screens.
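
    The pixel density comparison is easy to verify with a few lines of Python:

        # ppi.py -- pixel density (pixels per inch) of the displays discussed here
        import math

        def ppi(w, h, diagonal_inches):
            return math.hypot(w, h) / diagonal_inches

        print("39-inch 4k (3840 x 2160): %.0f ppi" % ppi(3840, 2160, 39))   # ~113 ppi
        print("10-inch XGA (1024 x 768): %.0f ppi" % ppi(1024, 768, 10))    # 128 ppi
        print("9.7-inch retina iPad:     %.0f ppi" % ppi(2048, 1536, 9.7))  # ~264 ppi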

    Then we cranked up the Skyrim game and ran that full-screen. It looked rather spectacular, though it probably ran in 1080p mode, because true 4k would require a good deal more video RAM as well as modified textures. It did look good, but it also bogged down.

    After an hour of playing with the 4k video setup I began feeling a bit nauseous and had to stop. The reason, apart from not being used to the large screen from so close up, was almost certainly a serious limitation of the 39-inch Seiki—it runs 4k video at a refresh rate of only 30 Hertz. That is very low. Most modern computer displays run at 60 Hertz or more. You don't actually see flickering, but I am convinced the brain has to do some extra work to make sense of such a low refresh rate. And that can make you nauseous.

    One of the original triggers of my impulse decision to get the 4k Seiki screen was to watch 4k video from our GoPro 3 Black cameras. The GoPro's 4k, unfortunately, is really also just a technology demonstration for now, as it runs at just 15 frames per second. Normal video viewing is at 30 fps, and games and other video may run at much higher frame rates yet. So until we can view 4k video at 30 fps and more, it's just an experiment.

    So that's where we stand with 4k video. There's a vast discrepancy between the marketing rhetoric that's pushing 4k now, and the fact that there's almost no content. And significant technical barriers in terms of frame rates, bandwidth, standards, and existing hardware and software that just can't handle it. It's a bit like Nokia trying to tell people you need a 41mp camera in a phone when the phone itself can display less than a single megapixel, and it would take more than four 4k screens in a matrix to display a single picture in full resolution.

    In summary, I did not go into great detail in my investigations, and there will be many out there who have much more detailed knowledge of all those standards and display technologies. But we did take a common sense look at what 4k can and cannot offer today. The following roughly sums up the situation:

    - 4k displays will soon be common. Dell already offers a range of affordable models (like this one).
    - 4k video support is rapidly becoming available as well, and 4k video cards start at well under $100.
    - Some Intel Haswell chips offer integrated 4k video support.
    - There's virtually no 4k content available.
    - 4k uses more bandwidth than many current linkups can supply.
    - The 4k experience is in its infancy, with insufficient refresh rates and frame rates.
    - BUT it's clearly the future.
    - 4k rugged displays and signage systems are rapidly becoming available.
    - Do spend the time learning what it all means, and how it fits together.

    Posted by conradb212 at 01:02 AM | Comments (0)

    December 26, 2013

    Does your Pentium have an Atom engine?

    There was a time in the very distant computing past when the sole decision you needed to make when buying a computer was whether to use the Intel 386/33 or save a few bucks and get the slightly slower 386/25. Today, if you use Intel's handy ARK app that lists every product available from Intel, there are a staggering 1,874 different processors listed. That includes processors targeted at desktop, server, mobile and embedded computers, but even if you leave out servers and desktops, there's still the choice of 949 processors for mobile and embedded applications. Not all of them are state-of-the-art, but even chips designated as "legacy" or "previous generation" are still in widespread use, and available in products still being sold.

    The mind-blowing number of Intel processors available brings up the question of how many different Intel chips the world really needs. As of the end of 2013, Apple has sold about 700 million iPhones, iPads and iPod Touch devices that made do with a mere dozen "A-Series" chips. Not too long ago, tens of millions of netbooks all used the same Intel Atom chip (the N270). So why does Intel make so many different chips? Even though many are based on the same microarchitectures, it can't be simple or cost-efficient to offer THAT wide a product lineup.

    On the customer side, this proliferation serves no real purpose. End users make almost all purchasing decisions based on price. Figure in desired screen and disk sizes, and whether there's an Atom, Celeron, Pentium, Core i3, Core i5, or Core i7 chip inside is confusing at best. For hardware manufacturers it's worse, as they must deal with very rapid product cycles, with customers both demanding legacy support AND availability of the latest Intel products. THEY must explain why this year's Intel crop is so much better than last year's, which was so much better than what Intel offered the year before. Or which of a dozen roughly identical Intel chips makes the most sense.

    As is, Intel has managed to bewilder just about anyone with their baffling proliferation of processors, and without the benefit of having established true brand identities. What Intel might have had in mind was kind of a "good, better, best" thing with their Core i3, i5 and i7 processors, where i3 was bare-bones, i5 added Turbo mode and some goodies, and i7 was top-of-the-line. But that never really worked, and the disastrous idea to then come up with a generation-based system that automatically made last year's "generation" obsolete only adds to the confusion. And let's not even get into Intel "code names."

    Atom processors were supposed to provide a less expensive alternative to the increasingly pricey Core chips—increasingly pricey at a time when the overall cost of computers became ever lower. Unfortunately, Intel took the same approach with Atom as Microsoft had taken with Windows CE—keep the line so wingclipped and unattractive that it would not threaten sales of the far more profitable mainline products, Windows proper in Microsoft's case and the Core chips in Intel's. At RuggedPCReview we deeply feel for the many vertical and industrial market hardware and systems manufacturers who drank Intel's Atom Kool-Aid only to see those Atom processors underperform and quickly need replacement with whatever Intel cooked up next.

    But that unfortunate duality between attractively priced but mostly inadequate entry level Atom chips and the much more lucrative mainline Core chips wasn't, and isn't, all. Much to almost everyone's surprise, the Celeron and Pentium brands also continued to be used. Pentium goes back to the early 1990s, when Intel needed a trademarkable term for its new processors, having gotten tired of everyone else also making "386" and "486" processors. "Celeron" came about a few years later, around 1998, when Intel realized it was losing the lower end market to generic x86 chipmakers. So along came the Celerons, mostly wingclipped versions of Pentiums.

    Confusingly, the Celerons and Pentiums continued to hang around when Intel introduced its "Core" processors. The message then became that Pentiums and Celerons were for those who wouldn't spring for a real Core Duo or Core 2 Duo processor. Even more curiously, Pentiums and Celerons still continued when the first generation of the "modern era" Core processors arrived, and then the second, third and fourth generation. Study of spec sheets suggested that some of those Pentiums and Celerons were what one might have called Core i1 and i2 chips, solutions for when costs really needed to be contained to the max. In some cases it seemed that Intel secretly continued its long-standing tradition of simply turning off features that were already part of those dies and chips. A weird outgrowth of that strategy were the last-ditch life support efforts of the dying netbook market to answer the calls for better performance by replacing Atom processors with Celerons that were really slightly throttled Core i3 processors. That actually worked (we have one of those final netbooks in the RuggedPCReview office, an Acer Aspire One 756, and it's a very good performer), but it was too little, too late for netbooks, especially against the incoming tide of tablets.

    Given that the choice of mass storage, the quality of drivers, keeping one's computer clean of performance-zapping gunk and, most of all, the speed of one's internet connection (what with all the back and forth with The Cloud) seems to have a far greater impact on perceived system performance than whatever Intel chip sits in the machine, it's more than curious that Celeron and Pentium are not only hanging around, but have even been given yet another lease on life, and this one more confusing than ever.

    That's because Intel's latest Atom chips, depending on what they are targeted at, may now also be called Celerons and Pentiums. It's true. "Bay Trail" chips with the new Atom Silvermont microarchitecture will be sold under the Atom, Celeron and Pentium brand names, depending on markets and chip configurations. Pontiac once took a heavy hit when the public discovered that some of their cars had Chevy engines in them. Pontiac is long gone now, and General Motors has ditched other brands as well, realizing that confusing consumers with too many choices made little sense. Even GM, however, never dominated its market anywhere near the way Intel dominates its.

    Where will it all lead? No one knows. Intel still enjoys record profits, and other than the growing competition from ARM there seems little reason to change. On the other hand, if the current product strategy continues, four years from now we may have 8th generation Core processors and 4,000 different Intel chips, which cannot possibly be feasible. And we really feel for the rugged hardware companies we cover. They are practically forced to use all those chips, even though everyone knows that some are inadequate and most will quickly be replaced.

    PS: Interestingly, I do all of my production work at RuggedPCReview.com on an Apple iMac27 powered by a lowly Core 2 Duo processor. It's plenty fast enough to handle extreme workloads and extensive multitasking.

    Posted by conradb212 at 06:35 PM | Comments (0)

    December 13, 2013

    Michael Dell's keynote at Dell World 2013: reaching for the cloud

    One big problem with being a public company is that every three months it's imperative not to disappoint analysts and investors. Dell won't have to worry about that anymore because it returned to being a private company. That means Dell can now take the longer look, pursue the bigger picture, and no longer suffer from the infliction of short term thinking, as Michael Dell so eloquently put it in his keynote address at the 2013 Dell World conference in Austin, Texas.

    And that was the core of Michael Dell's message: that as a private company Dell now has the freedom to make the bold moves that are necessary, invest in emerging markets, and take a long-term view of its investments. Dell said he senses a new vibe of excitement at the company that bears his name, and that he feels like he is part of the world's largest startup.

    He did, of course, also mention that sales were up in the double digits, that 98% of the Fortune 500 are Dell customers, and that Dell has established a new Research Division as well as the Dell Venture Fund. Dell reminisced about taking an IBM PC apart in his dorm room, finding that many of the components weren't actually made by IBM but were steeply marked-up 3rd party components, and feeling that he could make that technology available cheaper and better. He expounded on a recurring theme in his keynote, that the proliferation of computer technology between roughly 1985 and 2005 also saw poverty in the world cut by half.

    Dell showed an Apple-esque commercial that will run in 2014 and pays homage to all the little companies that started with Dell, including the one in dorm room #2714 (Dell itself, founded on a thousand bucks). Nicely done and charming. He spoke of how the future is the move beyond products to end-to-end scalable solutions. He spoke of $13 billion invested in research, highlighted the Dell PowerEdge servers that are in the top two in the world, and demonstrated, right on stage, how integrated server, storage and network functionality with fluid cache technology clocked in at over 5 million IOPS (input/output operations per second).

    Dell spoke of their new mantra, to transform, connect, inform, and protect. Transform as in modernize, migrate and transition to the future. Connect as in connecting all the various computing devices out there, including the Dell Wyse virtual clients, because, still and for a good time to come, "the PC, for a lot of organizations, is how business is done." Inform, as in turning data into useful, productivity enhancing results, making companies run better with data analytics. And protect as in offering next gen firewalls and connected security to keep data safe and ward off attacks before they even happen.

    Dell also reminded the audience that the company has been the leader in displays for over a decade, and touched on 4k2k video resolution that is available from Dell now, another example of Dell making disruptive technology available at accessible prices.

    Dell then introduced Elon Musk, who had arrived in a red Tesla Model S, his company's groundbreaking electric car. Along came David Kirkpatrick, who took on the job of engaging Musk interview style. Musk, of course, also pioneered PayPal, and, in addition to Tesla, runs SpaceX, sending rockets into space. Musk was there, however, not so much to discuss technology, but to illustrate the impact of pursuing the big picture, big ideas, things that simply need to get done. As if on cue, Dell rejoined the conversation, congratulating Musk and bemoaning the infliction of short term thinking that hamstrings progress and big ideas and bold bets.

    Musk said this must be the best time in human history, with "information equality" breaking down barriers and making the world a safer, better place, which meshed with Dell's clear belief that technology is a boon for mankind. Today, Musk said, anyone with internet access has more information available than the President of the United States had just 30 short years ago. Musk also expressed regret over a pessimistic media that always seems to seek the negative.

    The address concluded with an observation by Kirkpatrick about Tesla cars' constant connection with Tesla's computers, and how that information feedback is used to make the cars better. That, Dell said, mirrors his company's approach with customers: a constant back and forth, and constant feedback, in the quest to improve and innovate.

    This 75 minute keynote was a bold move with a big picture and a big message, with Dell, Musk and Kirkpatrick almost like actors in a play. Quite a task to pull this off, but the message was loud and clear and finely tuned: Dell is now free to pursue big ideas. Like Elon Musk with his electric car and rockets shooting into space, Dell can now reach for the cloud(s), and beyond.

    Posted by conradb212 at 02:57 PM | Comments (0)

    December 05, 2013

    Thoughts about display resolutions

    The resolution of computer displays is an interesting thing. There are now handhelds with the same number of pixels as large flatscreen TVs, Apple claims its "retina" displays are so sharp that the human eye can no longer see individual pixels, and the very term "high definition" is in the process of being redefined. Let's see what happened with display resolution and where things are headed, both in handhelds and in larger systems, and what 4k2k is all about.

    Color monitors more or less started with the original IBM PC's 320 x 200 pixel CGA resolution. In 1982, the monochrome Hercules video card provided 720 x 348 pixel for IBM compatibles. For a long time IBM's 640 x 480 VGA, introduced in 1987 with their PS/2 computers, was considered a high resolution standard for the desktop and for notebooks. Then came 800 x 600 pixel SVGA and 1024 x 768 pixel XGA, and those two hung around for a good decade and a half in everything from desktops to notebooks to Tablet PCs. Occasionally there were higher resolutions or displays with aspect ratios different from the 4:3 format that'd been used since the first IBM PC, but those often suffered from lack of driver and software support, and so pretty much everyone stayed with the mainstream formats.

    It was really HDTV that drove the next advance in computer displays. During a 1998 factory tour at Sharp in Japan I had my first experience with a wide-format TV. It looked rather odd to me in my hotel room, sort of too wide and not tall enough and not really a TV, but, of course, that turned out to be where things were going. In the US, it would take a few more years until the advent of HDTV brought on wide-format, and terms such as 720p and 1080p entered our tech vocabulary. For a good while, smaller and less expensive flatscreen TVs used the 1280 x 720, or "720p," format, while the larger and higher end models used full 1920 x 1080 pixel "1080p" resolution. That meant a wide 16:9 aspect ratio.

    Desktop and notebook displays quickly followed suit. The venerable 1024 x 768 XGA became 1366 x 768 WXGA and full 1920 x 1080 pixel displays became fairly common as well, albeit more on the desktop than in mobile systems. Professional desktops such as the 27-inch Apple iMac adopted 2560 x 1440 resolution. On the PC side of things standards proliferated in various aspect ratios, resulting in unwieldy standards terminology such as WSXGA+ (1680 x 1050) or WQXGA (2560 x 1600).

    An interesting thing happened. Whereas in the past, TVs and computer displays had very different ways to measure resolution, they're now more and more the same, what with flatscreen TVs really being nothing more than very large monitors. The 1920 x 1080 pixel 1080p format, in particular, is everywhere. Amazingly, that's becoming a bit of a problem.

    Why? Because as TVs become ever larger, the same old 1080p resolution no longer looks quite as high definition as it did on smaller screens. Put a 42-inch and a 70-inch TV next to each other and you can plainly see the degradation in sharpness. The situation isn't as drastic on notebooks because, after growing ever larger for many years, notebook displays have leveled off in size, and have even shrunk again (Apple no longer makes the 17-inch MacBook Pro, for example). Desktop monitors, however, keep getting larger (I use two 27-inch monitors side-by-side), and that means even "high definition" 1920 x 1080 doesn't look so good anymore, at least not for office-type work with lots of small text. While I was excited to get a reasonably priced HP Pavilion 27-inch 1080p IPS monitor for the PC sitting next to my Mac, I find it almost unusable for detail work because the resolution just isn't high enough to cleanly display small text.

    While current resolution standards are running out of steam on larger displays, the situation is quite different in the small screens used on handhelds and smartphones. There we have a somewhat baffling dichotomy where many industrial handhelds still use the same low-res 320 x 240 QVGA format that's been used since the dawn of (computer) time, whereas the latest smartphones have long since moved to 1280 x 800 and even full 1920 x 1080 resolution. Tablets, likewise, pack a lot of pixels onto the rather small 7-inch and 10-inch formats that make up the great majority of the tablet market. Apple led the way with the "retina" 2048 x 1536 pixel resolution on the 3rd generation iPad. That's like a 2x2 matrix of 1024 x 768 pixel XGA displays all in one small 9.7-inch screen. Trumping even that, the latest Kindle Fire HDX tablet packs an astounding 2560 x 1600 pixel onto its 8.9-inch screen. So for now, smartphones and tablets are at the front of the high-resolution revolution.

    Somehow, we quickly get used to higher resolution. Older displays that we remember looking great now look coarse and pixellated. With technology you can never go back. The state-of-the-art almost instantly becomes the acceptable minimum. Whereas our eyes used to expect a degree of blurriness and the ability to see individual pixels on a screen, that's less and less acceptable as time goes on. And it really does not make much sense to declare 1080 as "high definition" when by now that resolution is used on anything between a smartphone and an 80-inch TV.

    Fortunately, the next thing is on the horizon for TVs and monitors, and it's called 4k2k, which stands for roughly 4,000 x 2,000 pixel. 2160p would be a better name for this likely standard, as it is simply a 2x2 matrix of four current 1080p displays, or 3,840 x 2,160 pixel. That still only means that a giant new 80-inch screen will have no more than the pixel density of a 1080p 40-inch display, but it's certainly a logical next step.

    I had all of this on my mind when I received an email offer from one of my favorite electronics places. It was for a 39-inch Seiki TV/monitor with 4k resolution for a very attractive price and free shipping. I impulse-ordered it on the spot, telling myself that I need to know where the 4k technology stands and what, at this point, it can and cannot do. And this would finally be a monitor where I could watch the 4k video my GoPro 3 Black Edition can produce.

    So I got the Seiki and it's a great deal and a bargain. Or it would be if I actually had anything that could drive a 4k2k display in its native mode, which I don't. In fact, at this point there is virtually nothing that can drive a 4k display in full 4k 3840 x 2160 pixel resolution. Yes, the 4k videos from my GoPro 3 Black Edition would probably look great on it, but that would require me to copy the video footage to a PC that can drive an external 4k monitor, which virtually no stock PC can do today. DVD or Blu-Ray players certainly can't display in 4k2k, and even brand-new gear like the Sony PS4 game console can't. I COULD, of course, get a low-end 4k-capable video card from AMD, but I am not sure any of the PCs in the RuggedPCReview office could actually even accommodate such a card.

    The unfortunate truth is that as of late 2013, there's very little gear that can send a true 4K video signal to a 4K TV or monitor. Which means that most content will be viewed in up-sampled mode, which may or may not look great. This will undoubtedly become a marketing issue in the consumer space—there will be great interest and great expectations in 4K TVs, but just as was the case with 3D TVs a couple of years ago, there will be virtually no 4K sources and content. And that can make for a customer backlash. There is some very detailed news on Amazon (see here) that provides an idea of where things stand.

    What does all that mean for rugged mobile technology? Not all that much for now, but I am certain that the ready availability of super-high resolution on smartphones and consumer tablets will change customer expectations for rugged device displays just as capacitive touch changed touch screen expectations. Once the (technology) cat's out of the bag, that's it. It won't go back in.

    And just as I finished this entry, I see that Dell announced 24-inch and 32-inch UltraSharp monitors with 4k 3840 x 2160 resolution, and a 28-inch version will soon follow (see Dell news). Given that Dell is the leading flat-panel vendor in the US and #2 in the world, that likely means that we'll soon see a lot more systems capable of supporting 4k resolution.

    Posted by conradb212 at 04:53 PM | Comments (0)

    November 14, 2013

    State of Outdoor-Viewable Displays Late 2013

    One of the big differentiating factors in ruggedized mobile computers is how well the display is suited for work outdoors in bright daylight and in direct sunlight. This can make the difference between a device being useful and productivity-enhancing, or frustrating and nearly useless.

    Why is this such a big issue? Aren't today's displays so good that the only thing that matters is how large a display you want, and perhaps what resolution it should have? For indoor use that's true, but outdoors it's an entirely different story.

    The outdoor viewability problem

    Overall, LCD displays have come a very long way in the last two decades. If you're old enough, you probably remember those very small notebook displays that you could barely read. And if you looked at them from an angle, the color—if you had color—shifted in weird ways. Almost all of today's LCD displays are terrific. They are bright and sharp and vibrant, and you can view them from almost any angle (and depending on the technology, from all angles). Most of today's tablet and notebook displays are so good that it's hard to imagine they could get any better.

    Until you take them outdoors, that is.

    The difference between indoors and outdoors is amazing. A screen that is bright indoors will almost wash out when you take it outdoors on a sunny day. That's because even a very strong backlight is no match for the sun. Even very good displays become almost unreadable when they are facing the sun. The contrast goes away, the screen may go dark or it may become so reflective that it can't be used anymore. Some displays also assume strange hues and casts and colors when in the sun. Others have a shimmering iridescent look that distracts the eye. And resistive touch screens have a slightly yielding top surface that can make for optical distortions that can be very distracting.

    Matte and glossy displays

    Most notebook and tablet displays these days are very glossy. That's a trend that started perhaps a decade ago in Japan where vendors hoped bright, glossy screens would be more easily noticed in crowded electronics shop windows. Another argument for glossy displays was that they make the picture "pop" with rich color and sharp contrast when watching video or movies. That probably depends on the individual viewer, but overall glossy screens work well enough indoors (where there are few reflections) that virtually all manufacturers switched to them. Outdoors where there are a lot of reflections, glossy displays can be very hard to view.

    Some tablets and notebooks have matte screens or they have anti-glare coatings. A "matte" surface can be achieved via a liquid coating with tiny particles that then diffuse light, chemical etching that makes for a rougher surface, or mechanical abrasion. The much reduced intensity of light reflection makes matte surfaces ideal for office workers. You'd think that matte displays are also the answer for good outdoor viewability, and matte displays can indeed handle reflections much better outdoors as well. The problem, though, is that matte screens, especially older ones, just diffuse the light. When that happens outdoors where light can be very strong, the screen can turn milky and becomes hard or impossible to read.

    The different display technologies

    Most of today's standard LCDs are transmissive, which means that you have a backlight behind the LCD. This approach works great indoors because the ratio between the strength of the backlight and the reflected ambient light is very large. Outdoors, the ambient light is much stronger, and so the ratio between the strength of the backlight and the amount of reflected light is much smaller, which means there is much less contrast.

    In the past, notebook manufacturers tried different approaches to make the screens readable outdoors.

    One approach was to use reflective LCDs instead of transmissive ones. This way, the brighter the sunlight, the more readable the display becomes. This never caught on for two reasons. First, since you couldn't use a backlight, you needed a sidelight to make the screen viewable indoors. That just doesn't work with displays larger than those in a PDA. Second, even outdoors, the screens looked flat because the LCD background was greenish-brown, and not white.

    Another approach was "transflective" screens. Transflective screens were part transmissive so that you could use a backlight, but also part reflective so you could see them outdoors. This was supposed to be the best of both worlds, but it was really just a compromise that didn't work very well. So early transflective display technology was abandoned.

    Today, most outdoor displays use what one might call modified standard transmissive technology. These screens perform just like standard transmissive displays indoors, while controlling reflections and preserving contrast outdoors. They do that with various tricks and technologies such as optical coatings, layer bonding, and circular polarizers to reduce reflected light. The overall goal of all these measures is to control distracting reflections and to get the best possible screen contrast. That's because for keeping a display readable outdoors and in sunlight, preserving display contrast is more important than anything else. That's where effective contrast comes into play.

    Getac's QuadraClear brochure shows the effect of linear and circular polarizers

    Almost all major manufacturers of ruggedized mobile technologies have their own special approaches such as "QuadraClear" (Getac), "CircuLumin" (Panasonic), "MaxView" (Handheld Group), "xView Pro" (MobileDemand), "View Anywhere" (Motion Computing), "IllumiView" (Juniper Systems), "AllVue" (Xplore Technologies), and more.

    What matters is effective contrast

    There are various definitions of contrast. The most important one in outdoor displays is the "effective" contrast ratio, which doesn't deal with the annoying mirror-like reflections glossy screens are infamous for, but rather with the sum-total of the light reflected by the various layers a typical LCD assembly consists of. The effective contrast ratio is the ratio between that reflected light and the light generated by the display's own backlight.

    There are, in essence, two major ways to control those internal reflections. One is adding circular polarizers that combine a linear polarizer and a retardation film to block reflected light. The other is bonding together layers of the LCD assembly, thus eliminating two reflecting surfaces with every bond. How well it all works depends on exactly how these elements are combined to produce the best possible effect, and that's generally a closely-guarded secret.

    A rule-of-thumb formula to compute the effective contrast ratio of an LCD screen used outdoors is 1 + emitted light divided by reflected light. For emitted light we use the backlight, measured in nits. For reflected light we multiply moderate sunlight, which is the equivalent of about 10,000 nits, with the percentage of light reflected by the display. A normal, untreated notebook screen reflects about 2% of sunlight. A combination of optical coatings can bring that number down to about half a percent for displays without touch screens, and about 0.9% for displays with (resistive) touch screens.

    If you use this formula and plug in the numbers, you find that a standard notebook without any optical treatments has an effective contrast ratio of about 2:1, which means it's pretty much unreadable outdoors. If you boost the backlight or apply optical coatings, the contrast ratio goes up to about 6:1, which is the minimum the military requires in its computers. If you boost the backlight AND apply coating, you get contrast ratios of about 6 or 7 for displays with resistive touch, and up to 11 or 12 without.
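
    Put into code, the rule of thumb looks like this (the reflectance percentages are the typical values just mentioned):

        # contrast.py -- rule-of-thumb effective contrast ratio in sunlight
        SUNLIGHT_NITS = 10000.0   # moderate sunlight, per the rule of thumb above

        def effective_contrast(backlight_nits, reflectance):
            reflected = SUNLIGHT_NITS * reflectance
            return 1 + backlight_nits / reflected

        print("%.1f : 1" % effective_contrast(200, 0.020))   # untreated notebook: ~2:1
        print("%.1f : 1" % effective_contrast(500, 0.009))   # coated, resistive touch: ~6.6:1
        print("%.1f : 1" % effective_contrast(500, 0.005))   # coated, no touch screen: ~11:1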

    The bottom line with these "modified transmissive" displays is that there are a number of factors that affect the effective contrast ratio and thus display viewability outdoors. It all boils down to the best possible combination of optical coatings and layer manipulations to reduce internal reflection, and a good strong backlight.

    Super-bright backlights

    If that is so, then why not just have the strongest possible backlight? That is indeed one good way to boost the viewability of an outdoor display, but it comes at the cost of either a larger, heavier battery or less battery life. There are ways to minimize the extra drain on the battery, one being a light sensor that will throttle the backlight whenever full brightness is not needed, and another the presence of an easily accessible "boost" button that lets the user turn extra brightness on or off.

    If you're wondering how backlights are measured, an often used unit is the nit. The official definition of a nit is "a unit of illuminative brightness equal to one candela per square meter, measured perpendicular to the rays of the source." Most consumer notebooks and tablets are in the 200 nit range, most semi-rugged and rugged devices designated as "outdoor-viewable" are in the 500-700 nit range, and some "sunlight-viewable" screens go all the way to 1,000-1,500 nits.

    Does a touch screen impact outdoor viewability?

    If it is resistive touch, yes, it usually does make a difference. That's because light reflects on every surface of the LCD assembly, and resistive touch adds additional surfaces, therefore lowering the effective contrast ratio. Another problem is that the soft, yielding surface of resistive touch screens results in distorted reflections that are not present in totally smooth glass surfaces.

    Capacitive touch, which is increasingly used even in rugged devices, doesn't have this problem. However, it always pays to closely examine the display under all sorts of extreme lighting conditions, as some non-resistive technologies can have distracting grid patterns that become visible outdoors.

    Hybrid approaches

    In addition to reflective, transflective and transmissive screens and the various ways to tweak them for better outdoor performance, there are some interesting hybrid approaches. One of them is Pixel Qi's enhanced transflective technology where display pixels consist of a transmissive and a reflective part that have separate drivers. In essence, that allows them to have a "triple mode" display that can use one or both technologies, depending on the lighting situation. A while ago, RuggedPCReview had the opportunity to examine the Pixel Qi technology in a detailed review (see here), and we concluded that the technology absolutely works. However, potential users need to be aware of its inherent gradual switching from full color indoors to muted color outdoors and black and white in direct sunlight as that may affect color-dependent apps.

    Most recently, Pixel Qi has come out with displays that do show colors in direct-sunlight reflective mode, but we have not had hands-on with any of those yet. Also of note is that Pixel Qi's founder and chief technology officer left the company in early 2013 to work on the Google Glass project, and that's not good news for Pixel Qi.

    How about OLED?

    What about the OLED/AMOLED technology that is touted as the next big thing in flatscreen TVs, better than either LCD or Plasma? OLED screens have been used in some cameras, phones and even scuba diving computers for a while now, but I can't see them as a candidate for sunlight-viewable displays. That's because OLED technology is essentially a grid of special LEDs that easily washes out in sunlight or even bright daylight.

    Other important considerations

    On top of using the best possible outdoor-viewable display available for a given job, you also want good underlying LCD technology and good optics. A wide viewing angle makes the display easier to read, and we always strongly suggest starting with an IPS (In-Plane Switching, see Wiki) or, better yet, an AFFS (Advanced Fringe Field Switching, see wiki) screen so that viewing angles and color shifts simply aren't an issue. Some anti-glare coatings can create annoying reflections that can trick your brain into filling in detail, which makes the screen even less readable than it would be without the coating. You also don't want a display that reflects distorted images. That again confuses your brain and makes the screen harder to read. And you want a display that is as resistant to fingerprints as possible because fingerprints can become enormously visible and distracting under certain outdoor lighting conditions.

    Examples from RuggedPCReview.com testing

    Below are examples from RuggedPCReview testing that illustrate some of the issues and properties discussed:

    Below: The first image shows pretty much how the modern era of optically treated transmissive displays got started around 2007 when Itronix introduced their DynaVue outdoor-readable display technology. The image shows how DynaVue compared to an earlier non-DynaVue Itronix notebook.

    Below: All of a sudden it was possible to control reflection to an extent where displays remained viewable in direct sunlight. An older Itronix notebook with a matte display surface, by comparison, diffuses the sunlight so much that the screen is totally blown out.

    Below: Unbeknownst to most, Dell was another pioneer with optically treated outdoor-viewable displays. In 2007, a Dell Latitude ATG D630 with its 500-nit backlight offered excellent outdoor viewability with an 11:1 effective contrast ratio. Dell did that by bonding a piece of glass onto the polarizer film, thus eliminating the polarizer film's reflectivity, and then treating the smooth exterior surface of the glass.

    Below: Comparison between a "modern-style" optically treated transmissive display on an Xplore rugged tablet, and the matte display on a competing product. Indoors both tablets looked great, but outdoors the difference was obvious.

    Below: The same Xplore iX104 tablet versus one of the original convertible Tablet PCs. The matte, non-treated 2002-era Toshiba Portege display simply vanishes outdoors.

    Below: Matte displays can work quite well outdoors; this DRS ARMOR tablet mutes the reflection and remains quite viewable, though you have to hunt for the right viewing angle.

    Below: That's a Getac rugged notebook on the left versus a Gateway consumer notebook with a glossy display that looked great indoors and even had decent contrast outdoors, but the glossy screen made it unusable.

    Below: A GammaTech SA14 semi-rugged notebook compared to an Apple MacBook Pro. Indoors the MacBook excels; outdoors it falls victim to reflections.

    Below: Here we see how a rugged MobileDemand T7200 xTablet compares to a Google Nexus 7 consumer tablet. Same story: indoors the Nexus screen looks terrific, outdoors it's all reflections, although the display itself remains quite viewable.

    Below: Motion CL910 tablet vs. iPad 3—the iPad's retina display is terrific and is even viewable outdoors, but the super-glossy surface is very prone to reflections.

    Below: MobileDemand pioneered using a modified Pixel Qi display in a rugged tablet, their xTablet T7200. Note how the display works under all lighting conditions, from indoors to direct sun where the display switches to gray-scale reflective.

    Below: An example of the interesting optical properties of treated outdoor displays. The two Motion Computing tablets are both excellent outdoors, and have some of the best displays anywhere, but it's clear that they have different optical treatments.

    Below: Another example of the odd optical properties of some displays. This one is basically quite viewable, but the wavy, distorted reflections make it difficult to use.

    Below: An example of brute force—the Getac X500 rugged notebook has a superbright backlight that, combined with very good optical treatment, makes this one of the best displays available.

    So what's the answer?

    While there are a number of interesting alternative display technologies, at this point, the best overall bet is still a combination of optical treatments and coatings, direct bonding to reduce the number of reflecting surfaces, a reasonably strong backlight, and a touch technology with as little impact on display viewability as possible. The following all contribute to the best currently possible outdoor-viewable display:

    • IPS or AFFS LCD (for perfect viewing angle)
    • Anti-reflective and anti-glare treatments
    • Circular polarizers (combination of a linear polarizer and a retardation film to block reflected light)
    • Minimal number of reflective surfaces via direct bonding
    • Sufficiently strong backlight
    • Hard, flat surface to eliminate distortions
    • Suitably high resolution
    • Touch technology that does not show, distort or create optical aberrations
    • Surface that's not prone to fingerprints

    As is, most major vendors of rugged mobile computing technology offer outdoor/sunlight-viewable displays standard or as an option. Most perform quite well, though there are significant differences that can really only be evaluated in side-by-side comparison, as the industry does not generally disclose exactly how displays are treated. Such disclosure, and also the inclusion of effective contrast ratio into product specs would be tremendously helpful. That, of course, would require generally agreed-on standard definitions and testing procedures.

    The last word in outdoor-viewable display technology has not yet been spoken, and it's more than likely that disruptive new technologies will replace what's available now. However, today's technology has reached a point where it can be good enough to allow work in bright daylight and even direct sunlight, though it definitely pays to see for yourself which particular implementation works best for any given project. -- Conrad H. Blickenstorfer, November 2013

    (For the definitive article on the modified transmissive approach, see "GD-Itronix DynaVue Display Technology" by Geoff Walker. It goes back to 2007, but all the principles remain valid today. Thanks also to Mr. Walker for feedback and suggestions for this article.)

    Posted by conradb212 at 10:51 PM | Comments (0)

    October 23, 2013

    Two annoying trends

    Today I am going to rant a bit about two trends that simply make no sense to me.

    The first is "skeuomorphism." It's the new fashion word-du-jour, what with Apple and Microsoft demonizing it as if it were some sort of evil plague. As is, Wiki defines a skeuomorph as "a derivative object that retains ornamental design cues from structures that were necessary in the original." That includes, of course, many elements of graphical user interfaces. The desktop metaphor, after all, has been at the very core of every graphical user interface for the past 30 years.

    But now that very quality, making objects on a computer screen look just like the real thing, has come under heavy attack. And that even includes the three dimensional nature of the real world. Apple, especially, but also Microsoft, now want everything to be flat. As flat as possible. Anti-skeuomorphism forces argue that the public has now been exposed to computers long enough to no longer need the analogy to real, physical things. And so, in the eyes of many, the latest versions of many operating environments, and Apple's iOS, look dreadfully flat and barren indeed.

    In Apple's case, one could well argue that a bit of a backlash against skeuomorphic excess was probably a good thing. Apple, long the champion of good design, had begun slipping with some truly kitschy stuff, like the "leather-bound" address book, the Ikea-style wooden bookshelf and other affronts that likely would have had a healthy Steve Jobs frothing at the mouth. So a bit of streamlining was in order, but when I look at the sad, sparse, flat expanse and eensy-tiny lettering that now mars iOS and many of its apps, the sad icons that look like they just want to vanish from view, and the rest of the bleakness that now adorns iPhones and iPads, I wish Jonathan Ive and colleagues had stuck with hardware.

    You could argue, of course, that after decades of visual improvements and fine-tuning, the anti-skeuomorphism crusade simply rings in the advent of fashion in electronic interfaces. Just like fashion goes to extremes only to then denounce the trend and swing into the other extreme (neatly obsoleting billions of dollars worth of product in the process), perhaps we'll now have to put up with anemic, flat computer and tablet screens until the trend reverses and everything becomes three dimensional and lively again.

    Okay, the second trend is the push toward thin and slender hardware design at all cost. The just announced new Apple iPad Air is hailed as a wondrous achievement because it's thinner yet and weighs even less. It's a veritable Barbie of a tablet. And since this is Apple, and Apple decreed some years ago that computing devices need to be rectangular and very flat, we now have hyper-slim smartphones, hyper-thin tablets, and even hyper-thin iMacs, which in the latter's case makes absolutely no sense since they sit on a desk in front of you. And we also have hyper-thin HDTVs. Size is okay, as we now have smartphones with screen sizes approaching 6 inches and flat screen TVs approaching 90 inches. But it all must be very flat and thin.

    Why?

    I mean, making that technology so very flat simply makes it more difficult to design and manufacture, and since hardware happens to be a physical thing it often loses utility if it's pressed into too flat a design (the new iPad Air's battery is down to 32.9 WHr, vs. the iPad 4's 43 WHr). The dreadful sound of flat-screen TVs is a prime example, and the so-so battery life of many super-slim smartphones another. More and more, the trend to supreme thinness seems a narcissistic quest to prove that one's technology is so advanced that mere physicality no longer matters. Sort of like a supermodel starving herself into a skeletal, gaunt appearance just to be lauded for her discipline and elegance.

    It makes no sense. I mean, the latest crossover probably weighs almost 5,000 pounds, a house weighs an awful lot, and American people weigh more all the time, too. So why is ultimate thinness in an electronic device such a virtue?

    And it especially makes no sense for rugged devices where the very physicality of the design provides the structure and toughness to make it last on the job. And where a degree of volume means it'll run cooler and provide space for expansion and versatility. Yet, even rugged devices are getting thinner all the time. They have to, or the public, even customers in enterprise and industrial markets, will stay away.

    So there, two silly trends. And trends they are, as you can't keep making physical stuff thinner beyond a certain point. Once that point is reached, the quest is over, and the pendulum will reverse or go elsewhere. It's quite possible that the next Steve Jobs will some day present the latest super-gadget, and it will be egg-shaped. It's possible.

    Be that as it may, I just hope that technology will stay as free from fashion dictates as possible. Because once it takes fashion to sell gear, that means innovation is over.

    Posted by conradb212 at 08:21 PM | Comments (0)

    October 18, 2013

    Rugged Android device comparison table, and contemplations over Android in the rugged market

    On October 18, 2013, RuggedPCReview.com launched a rugged Android device comparison table. The table allows interested parties to view full specifications of all rugged handhelds and rugged tablets we're aware of.

    Given the absolutely massive number of Android devices activated worldwide -- about a billion -- it's amazing how few rugged Android devices are available. As we recently reported, both Honeywell/Intermec and Motorola Solutions have launched initiatives to make rugged Android devices available to industrial and enterprise markets, and other manufacturers are offering ruggedized Android-based handhelds and tablets as well. But there aren't many actual devices, probably less than a couple of dozen in both categories combined. And even that small number includes products that are offered with a choice of either Android or one version of Windows Mobile or another, which means they aren't really optimized for either.

    Add to that the fact that few of the available Android-based rugged devices are on the latest, or even a recent, version of Android, and that much of the hardware isn't anywhere near the technological level of consumer smartphones and tablets, and one has to wonder all over again why Android has such a terribly hard time getting going in rugged/industrial devices.

    On Microsoft's website you'll find a white paper entitled "Choose Windows Mobile Over Android for Ruggedized Handhelds" written by Gartner in February 2011 (see here). Among the key recommendations there were to "remain with Windows Mobile for ruggedized handheld-computer solutions, and to prepare for a transition to full Windows in subsequent implementations" and to "limit the scope of Android-based ruggedized application development through 2013." Of course, the two and a half years since Gartner issued the paper is an eternity in mobile electronics. At the time they still mentioned Android as the #2 smartphone OS behind Symbian!

    Gartner also cautioned that the lack of FIPS-140 compliance (FIPS 140 is a U.S. government computer security standard that specifies cryptographic requirements) was an issue for Android, and they predicted that enterprise software vendors would probably create HTML5-based client applications with cross-platform abstraction layers to provide some support of Android devices. FIPS-140 compliance was indeed an issue with Android, and one that's even now still only addressed on a device-by-device level. Cross platform application development is now available via platforms such as RhoMobile and iFactr.

    I don't know how widely read Gartner's 2011 report was, but over the past three years the rugged computing industry certainly heeded Gartner's advice to choose Windows Mobile over Android for ruggedized handhelds. Gartner's 2011 arguments made sense, but probably even Gartner didn't foresee that the installed base of Android devices would grow from under 200 million back in 2011 to a cool billion today. One could easily argue that playing it safe with Windows Mobile precluded participating in the rapid, massive learning curve with Android over the past two or three years.

    There are no good answers, and hindsight is always 20/20. Except that even two or three years ago it was quite obvious that Windows Mobile was doomed, and Microsoft did not seem to have a compelling roadmap in the mobile space. In historic terms, the predicament the rugged handheld and tablet manufacturers have been facing over the Android issue is no less harrowing than the predicament they faced a decade and a half ago when there was increasing pressure to abandon their various proprietary operating platforms in favor of Windows CE.

    What's the answer? Hard to say. It is impossible to ignore a user base of a billion and counting, because that billion already knows Android and how it works. On the other hand, Android's fragmentation is vexing, there remain questions about platform security (see overview of Android security), and the fact that Android was as clearly designed for capacitive multi-touch as Windows was for a mouse makes it less than perfect for gloves and wet places. At this point it is also still possible that Microsoft might somehow pull a rabbit out of its hat with Windows Embedded 8 Handheld, causing a percentage of the rugged mobile market to go one way and the consumer market another. Remember that the Palm OS once dominated the mobile OS market to an extent where Symbol (now part of Motorola Solutions) had a Palm-based industrial handheld (see here) before the tide went the other way.

    Posted by conradb212 at 06:28 PM | Comments (0)

    October 02, 2013

    October 1, 2013 -- the day Moto Solutions and Honeywell/Intermec became serious about Android

    This has been quite the day for Android in the rugged handheld space.

    Intermec, now part of Honeywell Scanning & Mobility, announced the CN51 rugged mobile computer. It is an updated version of Intermec's successful CN50, but has a larger, higher resolution screen (4.0-inch, 800 x 480) that now uses resistive multi-touch, updated WiFi, WAN, Bluetooth, camera, and scanners, and it's now based on the dual-core 1.5GHz TI OMAP 4 processor, which means Intermec can offer the CN51 either with Microsoft Windows Embedded Handheld 6.5 OR with Android 4.1.

    And Motorola Solutions announced the same day that three of its popular enterprise mobile computers (the ET1, MC40 and MC67) would now be available with Android Jelly Bean, fortified with Moto's own Mx security, device management and performance features added on top of the standard Android OS:

    Motorola Solutions Android mobile computers
    Product         ET1                      MC40                    MC67
    Type            Enterprise tablet        Rugged handheld         Enterprise PDA
    Display         7-inch (1024 x 600)      4.3-inch (480 x 800)    3.5-inch (480 x 640)
    Digitizer       Capacitive multi-touch   Capacitive              Resistive
    Original OS     Android                  Android                 Embedded Handheld 6.5
    Available OS    Android Jelly Bean       Android Jelly Bean      Android Jelly Bean
    RAM             1GB                      1GB                     1GB
    Storage         4GB Flash + microSD      8GB Flash               8GB Flash
    Size (inches)   8.8 x 5.1 x 1.0          5.7 x 2.9 x 0.8         6.4 x 3.0 x 1.3
    Weight          1.4 lbs.                 9.4 oz.                 13.5 oz.
    CPU             1GHz OMAP 4              800MHz OMAP 4           1GHz OMAP 4
    Scanning        Via 8MP camera           SE4500-DL               SE4500-SR/DL/DPM
    WWAN            Data-only                None                    Voice and data
    Op. temp        32 to 122F               32 to 122F              -4 to 122F
    Sealing         IP54                     IP54                    IP67

    Motorola Solutions points out that their three Android offerings are not only benefitting from the Mx extensions (see Mx website), but also from the company's RhoMobile Suite (see RhoMobile page), a cross-platform development tool used to create apps that are OS-independent. Which means the Moto Android devices can run standard Android apps, or HTML 5 cross-platform apps created with RhoMobile.

    Here's what Motorola Solutions had to say about the emerging Android devices:

    And below is Intermec's introduction to their CN51 that can run both Windows Mobile and Android:

    What does it all mean, this pretty high visibility push of Android in business and enterprise class devices? Well, there's the odd situation that while almost all smartphones are either iPhones or Android devices, virtually all industrial handhelds still run one version or another of Microsoft's old Windows CE or Windows Mobile. Which means that most industrial handhelds are by no means ruggedized equivalents of the smartphones a good part of humanity already knows inside out. Which results in extra training, an extra learning curve, and the near certainty that change will come anyway. Up to now, rugged mobile computing manufacturers have been remarkably conservative in acknowledging that trend, generally limiting themselves to offering an exploratory Android version or two of a Windows device running on similar hardware. What we're now beginning to see is the next step: making the hardware so that it can run both old and new.

    Now, no one wants to alienate Microsoft, of course, and so things are worded carefully. Intermec's press release includes a quote from industry analyst David Krebs, VP of mobile & wireless at VDC: "While rugged devices are designed to be more function or application-specific than smartphones, there is growing consensus that these devices deliver a similar immersive experience and have similar capabilities as function-rich smartphones. As Android matures in the enterprise it represents an increasingly viable option for rugged vendors such as Intermec to bridge this functionality gap and deliver the capabilities their partners and customers are looking for."

    On the Motorola Solutions side, Girish Rishi, Senior VP of enterprise solutions, said, "Now, businesses have more choices and can better manage rapid changes in the market by using Motorola’s tools that deliver complete solutions in less time while protecting their mobile investments.”

    It's fairly safe to assume that these are just first steps. The proposed hardware still represents compromises and is not (yet) truly Android-optimized. But the message communicated by both Intermec/Honeywell and Motorola Solutions is quite clear: We can't wait any longer, Microsoft. We need to get with the program before we lose our markets to consumer smartphones in a protective case.

    Posted by conradb212 at 12:09 AM | Comments (0)

    August 11, 2013

    Optimizing the legacy Windows interface for touch and tablets

    Tablet computers have been around for a quarter of a century, but it wasn't until the iPad's introduction that the tablet form factor took off. That's in part because the technology wasn't quite ready for tablets and, in much larger part, because Windows just didn't work well with tablets. Tablet historians will remember that both the first generation of tablets (circa 1992) and the second one (circa 2001) primarily used Wacom active digitizer pens. Those pens were (and are) precise enough to operate the Windows user interface, especially when aided by special Windows utilities and accommodations (Windows for Pen Computing back in 1992, and the Windows XP Tablet PC Edition in 2002).

    Ever since the iPad came onto the scene, the market has been demanding touch instead of pens. Touch works great with operating environments designed for touch, such as iOS and Android, but not nearly as well with Windows. Lately, we've seen a number of new Windows tablets that use either resistive or projected capacitive touch. Resistive works best with a stylus, though a firm finger touch usually also works, albeit not very precisely. Capacitive touch and legacy out-of-the-box Windows, however, are not a great match (except for the new Metro environment in Windows 8).

    There's not much that can be done to remedy that situation. Many people want and need Windows on a tablet, but in order for it to work like Windows, it needs the full Windows user interface, and that is simply not designed for pen and touch. As a result, if you work on a tablet with a small screen and limited resolution, your screen may look like it does in the picture below:

    The small scrollers, check boxes, menus and text work perfectly well if the tablet is used with a mouse, but they are almost impossible to use with touch.

    Fortunately, there are ways to improve the situation, if only to some extent. And that is done through customization of the Windows interface. Here's how it's done.

    In Windows 7, go to the Control Panel, then select Personalization. Go to the bottom of the screen and click on Window Color. At the bottom of that window, select Advanced Appearance Settings... What then shows up is the Window Color and Appearance dialog that has been around for many years. It lets users adjust the size of numerous Windows interface items.

    The "Item" pulldown provides access to:

    3D Objects (lets you select the color)
    Active Title Bar (select size and font)
    Active Window Border (lets you select the color)
    Application Background (lets you select the color)
    Border Padding (select size)
    Caption Buttons (lets you select the color)
    Desktop (lets you select the color)
    Disabled Item (lets you select the font color)
    Hyperlink (lets you select the color)
    Icon (select size and font)
    Icon Spacing (Horizontal) (select spacing)
    Icon Spacing (Vertical) (select spacing)
    Inactive Title Bar (select size, color and font)
    Inactive Window Border (select size and color)
    Menu (select size, color and font)
    Message Box (select font, font size and color)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)
    ToolTip (select color, font, font size and font color)
    Window (select color)

    Here's what it looks like:

    The default values for all of those items are, surprise, what works best for a desktop computer with a mouse and a large screen. Problem is, those default desktop/mouse settings are also what virtually all Windows tablets come with. It's as if vendors thought Windows worked equally well on any size and type of platform, which it definitely does not.

    As a result, Windows, which isn't very suitable for tablets in the first place, is even worse with the desktop/mouse default settings. So how are we going to remedy that situation? Not easily, but Windows can be optimized to work better on touch tablets.

    Colors, obviously, don't make a difference. So let's forget about all the interface items where you can only change color. Icon size and spacing are also pretty much a matter of choice, as is font color, so let's drop those as well. That leaves:

    Active Title Bar (select size and font)
    Border Padding (select size)
    Inactive Title Bar (select size, color and font)
    Inactive Window Border (select size and color)
    Menu (select size, color and font)
    Message Box (select font, font size and color)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)
    ToolTip (select color, font, font size and font color)

    Inactive items also don't matter, and neither does stuff you don't actually interact with, so let's drop those. That leaves:

    Active Title Bar (select size and font)
    Border Padding (select size)
    Menu (select size, color and font)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)

    Now we get to what matters. Here's why and how:

    Active Title Bar -- contains the three all-important boxes that minimize a window, maximize it, or close it. Those are used all the time. They must be large enough to easily operate with touch or a stylus. (On a 7-inch 1024 x 600 pixel display, I set title bar size to 32 and font size to 11).

    Border padding -- this one is annoyingly important. It's important because your finger or stylus must connect with the border to resize a window. It's annoying because a border large enough to easily manipulate is an eyesore and takes up too much space, making Windows look clumsy.

    Menu -- not critically important, but you'd like to be able to see the menu choices, so make them large enough. I used 32 for size, and 11 for font size.

    Palette Title -- honestly, I don't know why I left this one in there. I set it to 32 size and 11 font size.

    Scrollbar -- perhaps the most important one of all. If you need to scroll up and down and left and right with your finger or a stylus, it MUST be large enough to easily touch. I made it 32, and the font 11.

    Selected items -- that's the choices row when you select something from a menu. It needs to be large enough to read and select from via touch. Made it 32 and the text 11.

    So there. Once you've done this, Windows won't be that terribly difficult to operate with your fingers or a stylus. It's not going to look very pretty, and you'll use up far too much valuable screen real estate with interface items designed for a mouse, and now resized to make them work as well as they possibly can with touch.
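
    For those who'd rather script these adjustments than click through the dialog (say, to set up a whole fleet of tablets the same way), the same non-client metrics can be set programmatically through the Win32 SystemParametersInfo API. Below is a minimal Python sketch using ctypes, offered as an illustration rather than a finished tool: the 32-pixel values mirror the settings discussed above, while the border padding value of 8 is simply an assumed starting point to experiment with.

        # Windows-only sketch: enlarge title bar, menu and scrollbar metrics
        # for touch via SPI_SETNONCLIENTMETRICS. Save your current theme
        # first so you can easily switch back.
        import ctypes
        from ctypes import wintypes

        LF_FACESIZE = 32
        SPI_GETNONCLIENTMETRICS = 0x0029
        SPI_SETNONCLIENTMETRICS = 0x002A
        SPIF_UPDATEINIFILE = 0x01
        SPIF_SENDCHANGE = 0x02

        class LOGFONT(ctypes.Structure):
            _fields_ = [
                ("lfHeight", wintypes.LONG), ("lfWidth", wintypes.LONG),
                ("lfEscapement", wintypes.LONG), ("lfOrientation", wintypes.LONG),
                ("lfWeight", wintypes.LONG), ("lfItalic", wintypes.BYTE),
                ("lfUnderline", wintypes.BYTE), ("lfStrikeOut", wintypes.BYTE),
                ("lfCharSet", wintypes.BYTE), ("lfOutPrecision", wintypes.BYTE),
                ("lfClipPrecision", wintypes.BYTE), ("lfQuality", wintypes.BYTE),
                ("lfPitchAndFamily", wintypes.BYTE),
                ("lfFaceName", wintypes.WCHAR * LF_FACESIZE),
            ]

        class NONCLIENTMETRICS(ctypes.Structure):
            _fields_ = [
                ("cbSize", wintypes.UINT),
                ("iBorderWidth", ctypes.c_int),
                ("iScrollWidth", ctypes.c_int), ("iScrollHeight", ctypes.c_int),
                ("iCaptionWidth", ctypes.c_int), ("iCaptionHeight", ctypes.c_int),
                ("lfCaptionFont", LOGFONT),
                ("iSmCaptionWidth", ctypes.c_int), ("iSmCaptionHeight", ctypes.c_int),
                ("lfSmCaptionFont", LOGFONT),
                ("iMenuWidth", ctypes.c_int), ("iMenuHeight", ctypes.c_int),
                ("lfMenuFont", LOGFONT),
                ("lfStatusFont", LOGFONT),
                ("lfMessageFont", LOGFONT),
                ("iPaddedBorderWidth", ctypes.c_int),  # Vista and later
            ]

        user32 = ctypes.windll.user32
        ncm = NONCLIENTMETRICS()
        ncm.cbSize = ctypes.sizeof(NONCLIENTMETRICS)

        # Read the current metrics, then change only what matters for touch
        user32.SystemParametersInfoW(SPI_GETNONCLIENTMETRICS, ncm.cbSize,
                                     ctypes.byref(ncm), 0)
        ncm.iCaptionHeight = 32     # title bar (minimize/maximize/close)
        ncm.iMenuHeight = 32        # menu bar entries
        ncm.iScrollWidth = 32       # vertical scrollbar width
        ncm.iScrollHeight = 32      # horizontal scrollbar height
        ncm.iPaddedBorderWidth = 8  # border padding; assumed starting point

        # Write the metrics back and notify running applications
        user32.SystemParametersInfoW(SPI_SETNONCLIENTMETRICS, ncm.cbSize,
                                     ctypes.byref(ncm),
                                     SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)

    Going through SystemParametersInfo rather than the registry has one nice property: the change takes effect immediately, without logging off.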

    And here's what it looks like. Note the much larger scrollers, menus, window borders and text:

    Another good thing to know is that you can save those settings as themes (see image below). That means you can quickly toggle between a theme optimized for use with a mouse (or for when the tablet is hooked up to a larger external screen) and one for when the device is used as a tablet with finger touch.

    Can that be done in Windows 8 as well? Not as easily. The "Advanced Appearance Settings" option is gone from the Windows 8 (legacy) Control Panel. Users can still change the text size in the Display control panel, but everything else can only be adjusted with the Registry Editor, which is not for the faint of heart (see how it's done).
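
    Under the hood, both the Windows 7 dialog and the Windows 8 registry route end up in the same place: the HKEY_CURRENT_USER\Control Panel\Desktop\WindowMetrics key, where each size is stored as a string of negative "twips" (the pixel value multiplied by -15 at the standard 96 dpi). Here is a hedged Python sketch of what a registry-based version of the tweak might look like; the value names are the actual WindowMetrics entries, the pixel sizes are the ones used above (the border padding of 8 is an assumption), and Windows typically won't pick up the change until you log off and back on.

        # Windows-only sketch: write touch-friendly WindowMetrics values.
        # Back up the key (or save your theme) before experimenting.
        import winreg

        METRICS_PX = {
            "ScrollWidth": 32,       # scrollbar width
            "ScrollHeight": 32,      # scrollbar height
            "CaptionWidth": 32,      # title bar button width
            "CaptionHeight": 32,     # title bar height
            "MenuWidth": 32,         # menu item width
            "MenuHeight": 32,        # menu item height
            "PaddedBorderWidth": 8,  # border padding; assumed starting point
        }

        def to_twips(pixels):
            # WindowMetrics stores sizes as negative twips: -15 per pixel at 96 dpi
            return str(-15 * pixels)

        key = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                             r"Control Panel\Desktop\WindowMetrics",
                             0, winreg.KEY_SET_VALUE)
        for name, pixels in METRICS_PX.items():
            winreg.SetValueEx(key, name, 0, winreg.REG_SZ, to_twips(pixels))
        winreg.CloseKey(key)
        print("WindowMetrics updated; log off and back on to apply.")

    The same key can, of course, also be edited by hand in the Registry Editor.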

    Posted by conradb212 at 03:48 PM | Comments (0)

    June 25, 2013

    Logic Supply's logical approach to engineering their own systems

    When it comes to rugged computing gear, most people interested in this industry know the big players that dominate the market and get all the media coverage. But that's not everything there is. Unbeknownst to many outside of the circle of customers and prospects, a surprising number of smaller companies are designing and manufacturing rugged computing systems of one type or another. At times we come across them by chance. Other times they find us.

    And so it was with Logic Supply, located in South Burlington, a small town in the northwestern part of Vermont. They call themselves "a leading provider of specialized rugged systems for industrial applications," and asked if we could include them in our resource page for rugged system vendors. We could. And that led to some back and forth where I learned that while Logic Supply distributes a variety of rugged/embedded systems and components, they have also begun developing their own high-end chassis under their own LGX brand. That was interesting, and so we arranged an interview with Rodney Hill, Logic Supply's lead engineer, on the company, and how they go about creating their own, home-developed solutions in addition to being a distributor of products engineered elsewhere.

    RuggedPCReview: If you had to describe Logic Supply’s approach to case and system engineering in a minute or less, what would you say?

    Rodney Hill (Logic Supply): So, LGX is Logic Supply’s engineering arm. The design approach for LGX systems and cases can be boiled down to three ideas. First is our “designed to be redesigned” philosophy. Seed designs that are scalable and modular. From a seed idea we can create a product line through swappable front- and back-plates or resized geometry. Second is mass customization — by using standardized screws, paints, sheet metal folds, and design concepts, we leverage mass produced hardware whenever possible to keep the cost low. And through our modular designs we customize the off-the-shelf by using “upgrade kits,” which are quick to source and are cost-effective. Finally, innovation, not invention! There is a difference. Add value to things that work well, but do not re-invent the wheel.

    Was that under a minute?

    RuggedPCReview: Almost. But now let’s expand. You said scalable, modular, and “designed to be redesigned.” What do you mean by this?

    Rodney Hill (Logic Supply): Designing a new chassis is four to five times the price of redesigning a seed design. Much of the time wasted in projects is spent selecting paints, screws, boxing, foam, metal fold designs, etc. By using standardized design methods and seed concepts, our team can immediately start adding value. Ultimately the customer is only paying for the design and not the time the engineers spent trying to get their act together. We will be faster and more focused on quality and containing costs and risk.

    So, to your question: "designed to be redesigned" systems from LGX have already incorporated flexible features to accommodate 80% of the customizations that customers request with off-the-shelf hardware. The last 20% are resolved with "upgrade" kits that are included with the off-the-shelf chassis kit. But you're also using the proven benefits of the rest of the chassis (EMI and RFI shielding, for instance), and only adding risk in small portions. Meaning the rest of the chassis still meets all the same design criteria it was originally intended to support. So you can easily customize it without the risk of any negative effects on any of those features.

    In terms of scalability versus modularity — there are design themes seen in our cases. If you look carefully enough, you can begin to see connections between designs. The NC200 and the MK150 are two totally different designs – however they share about 80% of the same DNA, from vent holes to metal folds, etc.

    RuggedPCReview: How does cooling play into ruggedization?

    Rodney Hill (Logic Supply): Nature always wins. Meaning dust and water will destroy everything if given the chance. You need to decide how long the computer needs to live, and how much you’re willing to pay for it. Heat will shorten the life of components.

    So in terms of chassis design concepts: keep the chassis as cool as possible and as quiet as possible. Intelligent design is required to incorporate standardized cooling methods and proven airflow paths to cool many types of devices. Fan diameter, placement, and vent design all have effects on the acoustic design as well. Logic Supply will engineer noise out of systems with fan-muffling technologies, maximizing air throughput with smaller, simpler fans by identifying inefficiencies in orifice design. In short, having a fan against a grille will kill 50-60% of the airflow and multiply the noise by two or three times.

    Vent holes equal dust. Dust causes fans to break, which in turn results in hot computers. Eliminate vents and go fanless, and the operational temperatures and ruggedness greatly increase. Logic Supply defines "fanless" differently than the IMP mass market. Our definition is not simply "no fans." It is more than that: no fans, no vents for dust and dirt, and maintaining the ability to cool the computer system at 100% duty load for hours and days at a time. We want these systems to be heavy duty, and also to be able to last a long time. It is rated for high performance!

    RuggedPCReview: Can you talk about the design process? How long does it take from start to finish?

    Rodney Hill (Logic Supply): It happens pretty fast. This year we’ve done a fanless Mini-ITX case, a 1U rackmount case, a NUC [Next Unit of Computing] case, and we’re finalizing a fanless NUC case right now. We’ve also finished a number of customer-specific designs. These design concepts typically originate in sales — you know, this customer wants to do X and none of our existing solutions do it. But because we use seed designs, we don’t start from scratch. It really all depends, but usually designs take under three weeks, and prototypes are ready a few weeks after that. We review, test, and modify, then we’re typically getting production units in-house around five or six weeks after that.

    These core platforms can then be sold off-the-shelf, or customers can either go the semi-custom route or more radically modify the design. For simple modifications (like back-plates, front-plates, and other minor changes), figure one to five days in design and a three- to four-day lead on parts. For a customized chassis design with samples, five to six weeks, and four to six weeks after that to mass production.

    RuggedPCReview: Alright, finally, can you give us an example of a successful customer product development?

    Rodney Hill (Logic Supply): Sure. Last year we worked with a company called StreamOn to make a custom appliance with off-the-shelf components. StreamOn offers streaming audio solutions for the radio broadcast industry. The hardware they were using at the time was going End-of-Life, and they also needed a more specialized embedded system because their business was growing and they wanted to offer more features to their customers. They needed a variety of other things — outsourced fulfillment and things like that — but from an engineering perspective it was mostly that — the EOL and specialization. And all while remaining affordable for their customers. We worked from an existing system design — the ML250 — and customized it toward what they needed. We added an SSD, LCD screen and multifunction buttons, and on-case branding.

    Ultimately, the system we created was something like 30% smaller, and it was fanless, so it was more efficient and had a longer life expectancy. It also had a built-in watchdog timer and auto-restart BIOS so it could avoid any complications related to sudden power outages, etc. And it actually ended up being even less expensive for their customers than what they were previously offering. So that all worked out quite well. In fact, they recently won the [Radio World] "Cool Stuff Award," which was pretty, well, cool!

    This whole process was consistent with our typical design timeline, by the way. From the initial conversations to mass production — with samples and prototyping — we took about three months.

    Posted by conradb212 at 03:16 PM | Comments (0)

    June 24, 2013

    Why the JTG Daugherty NASCAR racing team chose rugged Dells

    The Christmas tree began its countdown. Yellow, yellow, yellow, GREEN! For an anxious moment, the racing slicks of my supercharged Acura fought for traction, then bit. 8,000, 8,500, 8,800 rpm, shift. Shift. Shift. Shift, and the 1/4-mile at Sacramento Raceway was over. The car slowed and I reached over to stop data logging on the laptop securely sitting on its mount, having just recorded tens of thousands of data points as the car shot down the track. The laptop was a Dell Latitude ATG 630D, connected via USB to the Hondata ECU under the dash of the car. Minutes later I would analyze the run on the Dell: temperatures, shift points, slippage, air/fuel ratio, knocks, timing, etc., and then make changes on the fly. The next heat was in less than 15 minutes.

    At the time I didn't know that a few years later I'd be talking with the JTG Daugherty NASCAR racing team about how they used rugged Dell laptops on their #47 Sprint Cup car, driven by NASCAR legend Bobby Labonte. Labonte won the Cup in 2000, during an era when he was a perennial contender, and he won IROC in 2001, following in the footsteps of his brother Terry Labonte, also an IROC and Cup champion. Now a senior among NASCAR drivers at age 49, Labonte is piloting car #47 for the team owned by Jodi and Tad Geschickter, and NBA Hall of Famer Brad Daugherty. Lady Luck hasn't been too kind to them this season, but that's certainly not due to this talented group, and also not due to the technology they're using. Most recently, while Martin Truex Jr. won at Sonoma Raceway in his Toyota Camry, a blown oil cooler ended Labonte's race in essentially the same Camry before it even began. Those are the breaks.

    So I felt almost guilty when I got on the phone Monday morning after that race with Matt Corey, the IT administrator at JTG Daugherty Racing, and Dell's Umang Patel and Alan Auyeung to discuss JTG Daugherty's use of Dell technology. Corey, in particular, probably didn't feel too good after the frustrating weekend and had plenty of other things to do at the shop, but he took the call. Much appreciated.

    So how is JTG Daugherty Racing using Dell computers? And what made them decide to go with Dell from the driver to the garage and the pit crew to the data center, with a complete suite of Dell technology and solutions that also includes rugged Dell ATG and XFR laptops? The choice of Dell for data center and office isn't much of a surprise, given that Dell has consistently been among the top three PC vendors worldwide and in the US. What's more interesting is that JTG Daugherty also chose Dell for their rugged laptops, a field dominated by Panasonic, Getac and a number of other vendors specializing in rugged equipment.

    Corey began by explaining the inherent need of a NASCAR racing team for rugged technology. No surprises here. There's rain, dust, vibration, extreme temperatures, the whole gamut of hazards rugged mobile computing gear is designed and built to survive. Add to that the extreme time crunch as a race car is tested and prepared, the extreme need for absolute reliability in a sport where fractions of a second matter, and a race car is checked, refueled and has all of its tires changed in something like 13 seconds. Things simply must not go wrong, ever, in such an environment, and racing teams certainly cannot put up with finicky computing technology that may or may not be up to the job. As an example, Corey tells of an incident where a consumer laptop simply wasn't able to handle vibration, causing them a lot of grief.



    So as a result, JTG Daugherty now uses rugged gear. Their race engineering team has Dell Latitude E6430 ATG laptops. The ultra-rugged Dell Latitude E6420 XFR is used on the truck and trailer. They also use Windows-based Dell Latitude 10 tablets in Griffin Survivor cases supplied by Dell. All of this means that the team can collect performance stats, analyze them, and make changes quickly and reliably. "We have connectivity everywhere," said Corey. "As the car chief makes a decision about a change to the car, for example, he now notes this on his Latitude 10 and the information is instantly communicated to everyone across the organization. All decisions from the car chief trickle down to updates to the racecar and with everyone synced together with tablets and other Dell technology, that information flow is now much faster, more reliable and more efficient."

    But still, why Dell for the rugged gear? Here I expected Corey to point to the advantage of dealing with a one-stop vendor. Instead he said that they had used Toughbooks in the past and liked them, but that "they really didn't change much over the years, same always," and that Dell updates more frequently. "We don't want 'plain vanilla,'" he said, "we want to be on the cutting edge of technology," and he listed the memory and processing speed required to power through race simulations, high-resolution imaging, and massive data sets.

    Staying at, or near, the leading edge in technology while still adhering to the longer purchase cycles and life spans of rugged equipment, and guarding against obsolescence of docks, peripherals, accessories and software, has always been a challenge for the rugged computing industry. While Intel tends to unveil a new processor generation and ancillary technology every 12 to 18 months, the rugged industry cannot possibly update at the same pace. Even Dell is not immune in that regard; as of now, the rugged XFR laptop is still based on the E6420 platform.

    Yet, Dell does have the advantage of very high production volume and with that comes access to the very latest technology. Combine that with the convenience and peace of mind of dealing with a large one-stop shop, and it's no surprise that even a NASCAR racing team chose Dell.

    See NASCAR Team Selects Dell to Speed Past the Competition, Dell's Latitude for Rugged Mobility page, and RuggedPCReview.com's most recent reviews of the Dell Latitude ATG and Dell Latitude XFR.

    Posted by conradb212 at 07:47 PM | Comments (0)
