
December 28, 2019

Thoughts about Computer Benchmark Testing

Unlike vehicles, whose specifications include a horsepower rating, computers don’t have a single number that indicates how powerful they are. True, horsepower (what an antiquated term that is) isn’t really all that relevant, because that number alone doesn’t tell the whole story. The same engine that makes a light vehicle a rocket might struggle in a larger, heavier vehicle. And even if the vehicles weighed the same, performance might differ because of a number of other factors, such as transmission, aerodynamics, tires, and so on. That’s why car magazines measure and report on various aspects and capabilities of a vehicle: acceleration, braking, roadholding, as well as subjective impressions such as how effortlessly and pleasantly a vehicle drives and handles.

So how is performance measured in computers, and why is “performance” relevant in the first place? To answer that, we first need to look at why performance is needed and what it does, in both vehicles and computers. In vehicles, performance clearly means how effortlessly the engine moves the car or truck. Step on the gas, and the vehicle instantly responds. And it has enough performance to handle every situation. That said, absolute peak performance may or may not matter. On the racetrack it does, but what constitutes enough performance for tooling through town and everyday driving is a different matter altogether. And that’s pretty much the same for computers.

The kind of performance that matters

The kind of performance that matters most in computers is that which enables the system to respond quickly and effortlessly to the user’s commands. And just like in vehicles, very high peak performance may or may not matter.

If a system is used for clearly defined tasks, all that is needed is enough performance to handle those; everything above that is wasted. If a system may be used for a variety of tasks, there must be enough performance to reasonably handle everything it may encounter. And if a system must be able to handle very complex tasks and very heavy loads, it must have enough peak performance to do those as well as possible.

What affects performance?

So how do we know if a system can handle a certain load? In combustion engines, the number of cylinders matters, even though, thanks to turbocharging and computerized engine control, it is no longer as relevant as it once was. Still, you see no vehicles with just one or two cylinders, except perhaps motorcycles. Four was and is the norm for average vehicles, six is better, and eight or even twelve means power and high performance. And it’s much the same in computers: the number of computing cores, the cylinders of a CPU, often suggests its power. Quad-core is better than dual-core, octa-core is a frequently used suggestion of high performance, and very high performance systems may have even more.

But the number of cores is not all that matters. After all, what counts in computing is how many instructions can be processed in a given time. And that’s where clock speed comes in. It’s measured in megahertz or gigahertz, millions or billions of cycles per second. More is better, but the number of cycles doesn’t tell the whole story. That’s because not all instructions are the same. In the world of computers, there are simple instructions that perform just basic tasks, and there are complex instructions that accomplish much more in just one cycle. Which is better? The automotive equivalent may be not the number of cylinders, but how big each cylinder is and how much punch it generates with each stroke. For many years Americans valued big 8-cylinder motors, whereas European and Japanese vehicle manufacturers favored small, efficient 4-cylinder designs.
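As a rough first-order sketch (real-world performance depends on much more, as discussed below), theoretical throughput is simply cores times clock times instructions per cycle. The numbers here are illustrative, not taken from any particular chip:

```python
# Rough first-order estimate of peak instruction throughput.
# This deliberately ignores memory, OS overhead and instruction mix,
# so treat the result as an upper bound, not a real-world score.

def peak_throughput(cores: int, clock_ghz: float, instructions_per_cycle: float) -> float:
    """Theoretical peak, in billions of instructions per second."""
    return cores * clock_ghz * instructions_per_cycle

# A quad-core chip at 2.0 GHz retiring 4 instructions per cycle per core
# matches an octa-core chip at 2.0 GHz retiring only 2 per cycle:
print(peak_throughput(4, 2.0, 4))  # 32.0
print(peak_throughput(8, 2.0, 2))  # 32.0
```

This is also why core count alone, like cylinder count alone, can mislead: fewer, "bigger" cores can match more, simpler ones.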

RISC vs CISC

In computers, RISC stands for reduced instruction set computer and CISC for complex instruction set computer. The battle between the two philosophies began decades ago, and it carries on today. Intel makes CISC-based chips that drive most of the world's PCs. On the other side is ARM (Advanced RISC Machine), whose designs are used in virtually all smartphones and small tablets.

What that means is that computing performance depends on the number of computing cores, the type and complexity of the cores, the number of instructions that can be completed in a second, and the type and complexity of those instructions.

But that is far from all. Performance also depends on numerous other variables. It depends on the operating system, which takes the results of performed instructions and converts them into something valuable for the user. That can be as simple as making characters appear on the display when the user types on the keyboard, or as complex as computing advanced 3D operations or shading.

Performance depends not just on the processor, but also on the various supporting systems that the processor needs to do its work. Data must be stored and retrieved to and from memory and/or mass storage. How, and how well, that is done has a big impact on performance. The overall “architecture” of a system greatly matters. How efficient is it? Are there bottlenecks that slow things down?

The costs of performance

And then there is the amount of energy the computer consumes to get its work done. That’s the equivalent of how much gas a combustion engine burns to do its job. In vehicles, more performance generally means more gas. There are tricks to get the most peak power out of each gallon or to stretch each gallon as much as possible.

In computers it’s electricity. By and large, the more electricity, the more performance. And just like in vehicles and their combustion engines, efficiency matters. Technology determines how much useful work we can squeeze out of each gallon of gas and out of each kilowatt-hour of electricity. Heat is a byproduct of converting both gasoline and electricity into useful (for us) performance. Minimizing and managing that waste heat is key to both maximum power generation and the efficiency of the process.

So how does all of that relate to “benchmarking”?

Benchmarking represents an effort to provide an idea of how well a computer performs compared to other computers. But how does one do that? With vehicles, it’s relatively simple. There are established and generally agreed-upon measures of performance. While “horsepower” itself, or the number of cylinders, means relatively little, there is the time a vehicle needs to accelerate from 0 to 60 miles per hour, or to cover a quarter mile from a standing start. For efficiency, the number of miles one can drive per gallon matters, and that is measured for various use scenarios.

No such simple and well-defined measurement standards exist for computers. When, together with a partner, I started Pen Computing Magazine over a quarter of a century ago, we created our own “benchmark” test for the first little Windows CE-based clamshell computers. Our benchmark consisted of a number of things a user of such a device might do in a given workday. The less overall time a device needed to perform all of those tasks, the more powerful it was, and the easier and more pleasant it was to use.
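The principle of such a task-based benchmark is simple enough to sketch. This is a hypothetical miniature, not the actual Pen Computing test suite: time a list of representative tasks, and the total elapsed time is the score, with less being better.

```python
import time

def run_suite(tasks):
    """Run each (name, function) task and return total elapsed seconds.

    The lower the total, the more powerful (and more pleasant) the device.
    """
    total = 0.0
    for name, task in tasks:
        start = time.perf_counter()
        task()  # the workload a user might actually run
        total += time.perf_counter() - start
    return total

# Hypothetical stand-ins for real workday tasks:
suite = [
    ("sort a list",  lambda: sorted(range(100_000))),
    ("search text",  lambda: "needle" in ("hay " * 50_000)),
]
print(f"total: {run_suite(suite):.3f} s")
```

The drawback, of course, is that such a score is only comparable across devices if the task list never changes, which is exactly the versioning problem discussed further below.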

And that is the purpose of a benchmark: to see how long a computer takes to complete the tasks we use it for. But, lacking such generally accepted concepts as horsepower, 0-60 and 1/4-mile acceleration and gas mileage, what should a benchmark measure?

What should a benchmark measure?

The answer is, as so often, “it depends.” Should the benchmark be a measure of overall, big picture performance? Or should it measure performance in particular areas or with a particular type of work or a particular piece of software?

Once that has been decided, what else matters? We’d rate being able to compare results with as many other tested systems as possible as very important, because no performance benchmark result is an island. It only means something in comparison with other results.

And that’s where one often runs into problems, because benchmark software evolves along with the hardware it is designed to test. That’s a good thing, because older benchmark software may not test, or know how to test, new features and technologies. But it can also be a problem, because results obtained with different versions of a particular benchmark may no longer be comparable.

But version differences are not the only pitfall. Weighting is another. Most performance benchmarks test various subsystems and then assign an importance, a “weight,” to each when calculating the overall performance value.

Here, again, weighting may change over time. One glaring example is the change in weighting when mass storage went from rotating disks to solid state disks, and then to much faster PCIe NVMe solid state storage. Storage benchmark results increased so dramatically that overall benchmark results would have been distorted unless the weighting of the disk subsystem was re-evaluated and changed.
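To illustrate the weighting problem with a toy example (the scores and weights below are invented, not taken from any real benchmark suite): when one subsystem score jumps by an order of magnitude, a fixed-weight composite balloons with it, even though the rest of the system is unchanged.

```python
# Hypothetical illustration of benchmark weighting. The subsystem
# scores and weights are made up for demonstration purposes.

def composite(scores: dict, weights: dict) -> float:
    """Weighted average of subsystem scores."""
    return sum(scores[k] * weights[k] for k in scores) / sum(weights.values())

weights = {"cpu": 0.4, "graphics": 0.3, "disk": 0.3}

hdd_system  = {"cpu": 1000, "graphics": 1000, "disk": 100}
nvme_system = {"cpu": 1000, "graphics": 1000, "disk": 5000}  # ~50x faster storage

# Identical CPU and graphics, yet the overall score roughly triples
# (about 730 vs. about 2200) purely because of the storage jump. That
# is why benchmark authors periodically re-evaluate subsystem weights.
print(composite(hdd_system, weights))
print(composite(nvme_system, weights))
```

Lowering the disk weight restores balance, but it also breaks comparability with every score computed under the old weights, which is the dilemma described above.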

Overall system performance

The big question then becomes: what kind of benchmark presents the most reliable indicator of overall system performance? It would be one that not only shows how a system scores against current-era competition, but also against older systems with older technologies. One that tests thoroughly enough to truly present a good measure of overall performance. One that makes it easy to spot weaknesses or strengths in the various components of a system.

But what if one size doesn’t fit all? If one wants to know how well a system performs in a particular area like, for example, advanced graphics? And within that sub-section, how well the design performs with particular graphics standards, and then even how it works with different revs of such standards? That’s where it can quickly get complex and very involved.

Consider that raw performance isn’t everything. A motor with so and so much horsepower may run 0-60 and the 1/4-mile in so and so much time. But put that same motor in a much heavier car, and the vehicle will run 0-60 and the 1/4-mile much slower. Computers aren’t weighed down by physical weight, but by operating system overhead. Recall that the earliest PCs often felt very quick and responsive, whereas today’s systems, with technology that’s hundreds or thousands of times as powerful, can be sluggish. Which means that OS software matters, too, and its impact is rarely measured in benchmark results.

Finally, in the best of all worlds, there’d be benchmarks that could measure performance across operating systems (like Windows and Android) and processor technologies (like Intel x86 and ARM). Such benchmarks do not truly, reliably exist.

How we measure performance

Which brings me to the way we benchmark at our operation, RuggedPCReview.com. As the name indicates, we examine, analyze, test and report on rugged computers. Reliability, durability and the ability to hold up under extreme conditions matter most in this field. Performance matters too, but not quite as much as concept, design, materials, and build. Rugged computers have a much longer life cycle than consumer electronics, and it is often power consumption, heat generation, long-term availability, and special “embedded” status that rate highest when evaluating such products. But it is, of course, still good to know how well a product performs.

So we chose two complete-system benchmarks that each give all parts of a computer a good workout, and we decided to standardize on versions that would not become obsolete quickly. This approach has served us well for nearly a decade and a half. As a result, we have benchmarks of hundreds of systems that are all still directly comparable.

A few years ago, we did add a third, a newer version of PassMark, mostly because our initial standard version no longer reliably ran on some late model products.

Do keep all of that in mind when you peruse benchmarks. Concentrate on what matters to you and your operation. If possible, use the same benchmark for all of your testing or evaluation. It makes actual, true comparison so much easier.

Posted by conradb212 at 7:50 PM

October 1, 2019

How the rugged PC "drop spec" just became different

The Department of Defense significantly expands drop test definitions and procedures in the new MIL-STD-810H Test Method Standard

You wouldn't know it from looking at almost all of the published ruggedness testing results, but the good old DOD MIL-STD-810G was replaced by the MIL-STD-810H in January 2019. And if you thought the old MIL-STD-810G was massive in size, the new one is bigger yet. While the old MIL-STD-810G document was 804 pages, the new one has 1,089. That's 285 extra pages of testing procedures.

The new standard brings quite a few changes in ruggedness testing. In this article I'll take a first look at one of the marquee tests as far as rugged mobile computing equipment goes, the transit drop test. It was described in Method 516.6 Procedure IV in the old MIL-STD-810G, and now it is under Method 516.8 Procedure IV in the new MIL-STD-810H. The drop section has grown from two to six pages, and there are some interesting changes.

Drop surface: Steel over concrete

The basic approach to drop testing remains the same. Items must still be tested in the same configuration that is actually "used in transportation, handling, or a combat situation." What has changed is how testing is done. Procedures are described in more detail.

Under the old MIL-STD-810G, lightweight items such as mobile computing equipment had to be dropped onto two-inch plywood backed by concrete, and only really heavy gear weighing over 1,000 pounds directly onto concrete. That has changed. The DOD was apparently concerned about repeatability of results, and surface configuration can have a substantial impact on that. The DOD wanted to test the most severe damage potential, that of "impact with a non-yielding mass that absorbs minimal energy."

Since plywood hardness can change and affect results, under MIL-STD-810H the default impact surface is now steel plate over concrete. For mobile computing gear purposes, the steel plate must be at least an inch thick, have a Brinell hardness of 200 or more, the concrete underneath must be reinforced, have a minimum compressive strength of 2500 psi, and the steel must be bonded or bolted onto the concrete to form a uniform, rigid structure.

This seems drastic. Clearly, what the DOD is concerned about here is that it is the dropped item that must absorb the impact, and not the surface it is dropped on, hence the steel instead of plywood. Fortunately, the DOD realizes that there is a big difference between dropping light and very heavy gear, and also what something might fall on. On an aircraft carrier, for example, it would almost always be steel. Out there in the field, it's more likely asphalt or dirt.

So the supporting notes allow that "concrete or 2-inch plywood backed by concrete may be selected if (a) a concrete or wood surface is representative of the most severe conditions or (b) it can be shown that the compressive strength of the impact surface is greater than that of the test item impact points." Whew. But steel is actually a very good idea.

Different drop scenarios

The MIL-STD-810H recognizes that not all drops are equal. The "transit drop" in the old MIL-STD-810G seemed to primarily assume that something falls off a truck when loading and unloading equipment. This really didn't have much relevance to the most likely drop scenario in mobile computing -- dropping a tablet or laptop while using it in the field. So whereas the old standard only had one drop scenario -- transit drop -- the new MIL-STD-810H standard has three.

There is the "logistic transit drop test" that states drop conditions for "non-tactical logistical transport," i.e. dropping things off the proverbial truck. The "tactical transport drop test" includes scenarios associated with "tactical transport beyond the theatre storage area." And, finally, there's a "severe tactical transport drop test" where items pass as long as they do "not explode, burn, spread propellant or explosive material as a result of dropping, dragging or removal of the item for disposal."

Now what does all that mean?

The "logistic transit drop" test is, in essence, the same as the old MIL-STD-810G transit drop test. Items where the largest dimension is no more than 36 inches (i.e. all mobile computing gear) must pass 48-inch drops on "each face, edge, and corner; total of 26 drops." These 26 drops may be among no more than five test items.
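The figure of 26 drops follows from the geometry of a rectangular box: 6 faces, 12 edges, and 8 corners. A quick enumeration confirms the count:

```python
# Each drop orientation of a box can be coded as a point (x, y, z)
# with each axis -1, 0, or +1, excluding the center (0, 0, 0).
# Faces have one nonzero axis, edges two, and corners three.
from itertools import product

orientations = [p for p in product((-1, 0, 1), repeat=3) if p != (0, 0, 0)]

faces   = [p for p in orientations if sum(abs(c) for c in p) == 1]
edges   = [p for p in orientations if sum(abs(c) for c in p) == 2]
corners = [p for p in orientations if sum(abs(c) for c in p) == 3]

print(len(faces), len(edges), len(corners), len(orientations))  # 6 12 8 26
```

So one full "logistic transit drop" pass means every one of those 26 orientations hits the surface once.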

The "tactical transport drop" offers five scenarios (ship, unpackaged, packaged, helicopter, and parachute), of which only one applies to conventional ruggedness testing of mobile computing gear: "unpackaged handling, infantry and man-carried equipment." There, the drop height is five feet. And instead of 26 drops, there are five standard drop orientations (drop to flat bottom, left end, right end, bottom right edge at 45 degrees, and top left end corner at 45 degrees), with each item exposed to no more than two drops.

The "severe tactical transport drop" test really only applies to items being dropped from significant heights, like helicopters, aircraft, cranes and such. There, the drop height ranges from 7 feet (falling out of a helicopter, unpackaged) to 82 feet (shipboard loading onto an aircraft carrier). This rarely applies to rugged mobile computers, but for bragging rights, some makers of rugged computing gear will undoubtedly perform those tests (you know who you are!).

Guidance systems

If you've ever wondered how one can drop an item so that it lands exactly on a mandated edge or corner, the MIL-STD-810H offers help and a suggestion: "Guidance systems which do not reduce the impact velocity may be employed to ensure correct impact angle, however, guidance shall be eliminated at a sufficient height above the impact surface to allow unimpeded fall and rebound." We've never seen any such system, and it will be interesting to see whether someone comes up with such a drop test guidance mechanism.

Evaluating results

Now how does one perform, document and evaluate the drops? That has always been the weakest point in drop testing, and the new MIL-STD-810H brings no change.

It's simply recommended to "periodically" examine an item visually and operationally during the testing, to help in the follow-up evaluation. It's recommended to document the impact point and/or surface of each drop and any obvious damage. And that's it.

So although the "drop test" section in the new MIL-STD-810H has grown considerably in length, new categories have been added, and testing methods mandated in more detail, much remains quite vague. That's to be expected of a testing procedure that covers a vast variety of different items that may be packaged, unpackaged, and ranging in weight from very light to thousands of pounds. And so the section ends with "conduct an operational checkout in accordance with the approved test plan."

My recommendations

What do we learn from all this? In essence, that it all depends, even under the more detailed guidelines of the new MIL-STD-810H.

What's important with mobile computing gear is that it still works after you drop it. Whether it's operating during the test or not used to matter back when mobile computers had hard disks. With solid state storage, it really doesn't matter functionally whether testing is done with the system on or off.

I suggest doing the tests with the unit operating. That way it's immediately obvious if a drop killed the unit or not. And leaving it on alleviates the need to start it up after every drop.

I would not split the mandated number and types of drops across five units. That makes no sense. Instead, I would do all mandated drops on a single unit, and then repeat the full set of drops on five successive units. All five must pass, i.e. still work after all drops. That way, one can be reasonably sure the results were not a fluke.

As far as the surface goes, most units will fall on carpet or wood indoors, and on anything from natural surfaces to concrete outdoors. Using the suggested steel over concrete shows you're taking the testing seriously. Using plywood is a bit of a cop-out, because it's much softer and less likely to damage a unit. And the hardness of plywood varies.

Doing the 5-foot "tactical transport drop" test definitely deserves credit. When a handheld computer is used as a phone, it'll fall from five or six feet. And even tablets can easily fall from more than four feet. Besides, the "tactical transport drop test" for "infantry and man-carried equipment" (presumably the DOD means women-carried as well) much better describes the real-world drop scenarios of mobile computing gear.

Passing the "severe tactical transport drop" test that starts at seven foot drops gets extra credit.

So hit the books and study the pertinent sections of the new MIL-STD-810H. You don't want to be the last to switch. -- Conrad H. Blickenstorfer, September 2019

Posted by conradb212 at 6:06 PM

July 6, 2019

The uneven performance of cameras in rugged handhelds and tablets

Recently I ran the usual set of integrated camera test pictures for our RuggedPCReview.com product testing lab with four devices all at once. That meant taking two pictures each of the 20 or so test settings around our offices in East Tennessee. The settings represent some tasks that users of rugged handhelds and tablets might do on the job. Examples are meter reading, capturing information from accessible and not-so-accessible labels, markings, or instructions, and so on. For a splash of color and to test close-up performance we also include pictures of flowers and greenery.

The reason we take two pictures of each subject is that, on the job, one doesn't always have time to carefully set up a subject and baby the shot. It's very much point-and-shoot. On the job one might take two shots of whatever information is to be captured, just to make sure. So we do that, too. That cuts the chance of ending up with a lousy shot in half. And lousy shots are still possible, even with the latest cameras and imagers.

Back in the lab, we examine the 20 or so pairs of pictures and select the better of each pair. We then pick nine representative shots and arrange them in a 3 x 3 picture compilation. We save that compilation full-size as well as down-sampled to fit into our web page templates, both at 72dpi and 144dpi. This way, viewers can click on the screen image to load the full-size compilation and, if so desired, download it onto their computer for closer examination.
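The 3 x 3 compilation step can be sketched with the Pillow imaging library. The tile size here is an illustrative placeholder, not the dimensions we actually use:

```python
# Sketch of assembling nine selected shots into a 3 x 3 compilation,
# assuming the Pillow library is installed. The 800 x 600 tile size
# is an assumption for illustration, not our actual workflow value.
from PIL import Image

TILE_W, TILE_H = 800, 600  # per-tile size (assumption)

def make_compilation(images: list) -> Image.Image:
    """Arrange nine PIL images into a 3 x 3 grid and return the result."""
    assert len(images) == 9
    grid = Image.new("RGB", (3 * TILE_W, 3 * TILE_H))
    for i, img in enumerate(images):
        tile = img.resize((TILE_W, TILE_H))
        grid.paste(tile, ((i % 3) * TILE_W, (i // 3) * TILE_H))
    return grid

# Usage: pick the nine best shots, then
# make_compilation([Image.open(p) for p in paths]).save("compilation.jpg")
```

Down-sampled web copies could then be produced from the same grid, for example with Pillow's thumbnail method, before saving at the template sizes.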

Cameras integrated into mobile computers aren't new. They've been around for two decades or more. In fact, rugged handheld computers and tablets had cameras even before cameras became an integral (and, many say, the most important) part of every smartphone. That's because rugged mobile computers are, in essence, data collection devices, and image data is part of that. Unfortunately, those early integrated cameras were often quite useless. Many were terrible and in no way up to the jobs they were supposed to do.

We've pointed this out again and again over the years — first in Pen Computing Magazine and now in RuggedPCReview.com, and heard all the reasons and excuses why integrated cameras weren’t any better. It took the iPhone and then the global smartphone revolution to demonstrate that tiny cameras in very small devices could be not only acceptable, but very good. So good, in fact, that smartphone cameras have replaced the dedicated point & shoot camera market. And so good that here at RuggedPCReview.com even for product photography we've been using smartphones and no longer dedicated cameras for the last three years.

Unfortunately, while smartphone cameras went from very good to excellent to downright stunning, such progress was slow in translating into similar improvements in integrated cameras. Cameras built into rugged mobile devices have become much better thanks to the global proliferation of smartphones and tablet technology. Better imagers are available at lower costs, and that means that we're now seeing SOME rugged devices with very decent cameras. Decent, but, with some notable exceptions, certainly not stunning like in the better smartphones.

Why is that? And why does this problem persist? We don't know. The primary excuses for mediocre integrated camera performance used to be size, cost, and lack of OS support. That doesn't wash anymore. If sliver-thin smartphones can take terrific pictures, it can't be size. Imager cost, likewise, has come way down. But how about operating system support? That may be the most likely reason.

It has long baffled us why cameras integrated into Windows tablets perform nowhere near as well as their imager specifications suggest. By far the most likely culprit is the truly awful generic Windows Camera app. Yes, there's the ever-present driver support issue, and supporting all sorts of different imagers in all sorts of different computer hardware isn't easy. But even that cannot justify the awfulness of the Windows Camera app, which usually lacks, well, just about everything. The feeling we often get when testing integrated cameras is that the imager could do much better, but not when it has to deal with the stark, ultra-basic Windows Camera app that supports almost nothing you'd expect from a camera.

Compare the imaging wizardry that can be done with today's smartphones with the sad nothingness of the Windows Camera app. The gulf couldn't be larger. Things are considerably better on the Android device side, because Android was designed from the start as a smartphone OS. And that included decent camera software. Android, of course, also has to deal with all sorts of different imagers and hardware, but there's the economy of scale. Android is absolutely dominant in smartphones and that means literally billions of devices, and thus massive software developer support. As a result, rugged Android devices almost always outperform rugged Windows-based devices in imaging operation and quality. By a considerable margin.

What's the solution? We don't know. Maybe developing a decent camera app simply isn't economically feasible for the relatively low production runs even of popular rugged devices. Maybe the assumption is that almost every user will have a smartphone in their pocket anyway, and use that for serious photography. But then why even bother with putting integrated cameras into devices? Are they just a checkbox item for government requests for proposals?

But how did we make out with testing those four devices? For the most part, the resulting pictures were much better than what was possible in the past. Some of them were even excellent. And each device was capable of capturing images and video good enough for the job of documenting project-related information.

That said, the one-size-fits-all nature of the generic Windows and Android camera apps can be very frustrating when it's so obvious that the software is holding back the hardware. If the software cannot take full advantage of the imager specs, and the lack of settings and functionality results in blurry pictures because the software can't properly focus or adjust for lighting, movement, contrast or type of light, then it hardly makes any sense to use the camera at all.

It pains me to be so negative about this, especially since some progress has been made. But it just is not enough. At a time when smartphones offer 4K video at 60 frames per second, 1080p/30fps simply isn't enough. Sluggish, lagging operation isn’t acceptable. And it makes no sense for camera software to not even support the full resolution of the hardware, or only in certain aspect ratios. It was terrible software that killed off the dedicated point & shoot camera market when smartphones came along, and it is terrible software that continues to make so many integrated cameras essentially useless.

This really needs to be addressed. As is, Android is way ahead of Windows in making good use of the imaging hardware built into rugged mobile computing gear. Bill Gates recently mentioned that one of his greatest regrets is not having taken the mobile space seriously and letting Android become the (non-Apple) mobile OS winner. It seems Microsoft still isn’t taking the mobile space seriously. — C. Blickenstorfer, July 2019

Posted by conradb212 at 3:22 PM

May 17, 2019

Android contemplations 2019

In this article I’ll present some of my thoughts on Android, where it’s been, and where it’s going. I’ll discuss both my personal experiences with Android as well as my observations as Editor-in-Chief of RuggedPCReview.com.

Although my primary phone has been an iPhone ever since Steve Jobs introduced the first one, and although my primary tablet is an iPad, and has been ever since the iPad was introduced to a snickering media that mocked it as just an overgrown phone without much purpose, I am inherently platform agnostic.

I do almost all of my production work on a 27-inch iMac because, in my opinion, there's nothing like a Mac for stress-free work, backup, and migration. But I also use Windows desktops because Windows continues to be, by far, the dominant desktop and laptop OS, and because all laptops and many tablets in the rugged mobile computing industry that I cover use Windows. I have a Linux box because I want to stay more or less up to date on an OS that so much is built on, even if few realize just how much runs on Linux in the background. And I use Android because Android is, by far, the dominant OS on smartphones and a major factor in tablets (and Android is built on Linux).

Over the course of my career I've seen lots of operating systems come and go. The Palm OS once ruled the mobile space. Microsoft had Windows CE in its various incarnations. There was EPOC, which later became Symbian. Android actually predated iOS, though the first commercial version didn't appear until 2008. In 2009, Motorola's Droid commercials alerted the public to Android, but few would have guessed its impending dominance. By 2010, 100,000 Android phones were being activated every day, yet in the rugged space Android remained largely unknown. I did a presentation on trends and concepts in mobile computing at a conference in Sweden where I pointed to Android and its future potential in rugged mobile devices.

In August 2010, I bought an Augen GenTouch78 for US$149.95 at K-Mart to educate myself in more detail about the potential of Android in tablets and to present my findings on RuggedPCReview.com (see here). The little 7-inch Augen tablet, one of the very few Android tablets available back then, ran Android 2.1 "Eclair" and showed promise, but it didn't exactly impress. So while Android had already gained a solid foothold behind iOS and Symbian in smartphones, its future in tablets still seemed uncertain.

Android's fortunes in rugged tablets appeared to change in 2011 when, to great fanfare, Panasonic introduced the Toughpad A1 at Dallas Cowboy Stadium in Arlington, Texas. Getac followed in 2012 with the Z710, and Xplore in 2013 with the RangerX. None of these lived up to sales expectations. In light of the massive success of Android in smartphones, the problems Android had in establishing itself in tablets seemed hard to understand. In hindsight, it may have been a combination of Google's almost 100% concentration on phones and Microsoft's rediscovery of touch and tablets with the introduction of Windows 8 and, from 2012 on, the Microsoft Surface tablets.

In the ruggedized handheld space, driven by the success of Android-based smartphones, the situation began to change. After having been beaten to the punch by smaller, nimbler companies such as the Handheld Group, Unitech and Winmate, both Motorola Solutions and Honeywell began placing emphasis on Android. Both announced their commitment to Android in late 2013, albeit initially by simply making some of their devices available either with Windows Mobile or with Android. For the next several years, offering handhelds in both Windows Embedded Handheld (the renamed Windows Mobile) and Android versions became the norm. It was a (not very cost-effective) case of hedging bets and offending no one.

By 2016 it became obvious that Microsoft had either lost the mobile war or had simply lost interest in it. The situation had been quite untenable for several years where Microsoft had essentially abandoned Windows Embedded Handheld and concentrated on Windows Phone instead. Windows Phone, later strangely renamed to Windows Mobile, didn't go anywhere either, and so Microsoft called it quits.

It was time for me to get another personal Android device to see how the platform had developed on the tablet side. My choice was the Dell Venue 8 7000 Series, a gorgeous 8.4-inch tablet running Android 5.1. Beginning with its super-sleek design, its wonderful 2,560 x 1,600 pixel OLED display, its four (!) cameras, and its excellent battery life, the Dell tablet was simply terrific (see my coverage in RuggedPCReview.com). And Android certainly had come a very long way. Dell, however, didn't stay long in the Android tablet space. The fact that I was able to get it at half of its original price should have been an indication.

Around that time I also began writing Android user manuals for some clients. This made me realize one major weakness of Android. Its user interface, while suitable for simple phone tasks, is hopelessly convoluted and everything changes with every new rev. Features, functions, screens and menus are forever haphazardly rearranged and renamed, making Android needlessly frustrating and complex to use. Adult supervision is needed to rein in that endless fiddling around with the Android UI. And with every new rev, Google's presence becomes more heavy-handed in Android. "Neutral" apps are being replaced with Google apps, and there is ever more pressure to use Google accounts and Google services. Google was, and is, taking over.

When my Dell tablet gave up the ghost a few months ago, I began looking for another personal Android device. This time, for a variety of reasons, I wanted a phone and not a tablet. This would give me a chance to get a very recent version of Android, and also the opportunity to see where your average consumer smartphone stands compared to rugged Android devices. While I was at it, I felt this was also a good time to get to know one of the rapidly growing Chinese smartphone giants, namely Huawei (Wowie? Wha-Way? Whaa-wee? Huu-Ahh-Way?).

So in May 2019 (just a few days before Google's sudden termination of its relationship with Huawei) I purchased a Huawei P30 Lite smartphone, a lower-cost version of the company's P30 Pro. Just like the iPhone, Huawei's latest phones all look almost identical, but there are fairly substantial differences as well. Screen sizes vary, as do cameras, materials, batteries and technologies. My (unlocked) P30 Lite cost just $300. It's between the iPhone XS and XS Max in size, but is both lighter and a bit thinner than even the smaller iPhone XS.

The P30 Lite runs Android 9.0.1, close to bleeding-edge. The 6.15-inch IPS LCD display is bright enough (450 nits as measured) and its 1080 x 2312 pixel resolution translates into a stellar 415ppi. The phone comes with 4GB of RAM, 128GB of internal storage, and a micro SDXC slot that can handle up to 1TB cards.

There are four cameras: a 24MP f/2.0 selfie camera, a 24MP f/1.8 main camera (and EU versions even a 48MP camera!), an 8MP ultrawide camera, and a 2MP "depth sensor" camera for adding bokeh effects. Despite the very high resolution, video is limited to 1080p/30fps, probably so as not to compete with the higher-end models. There are two nano-SIM slots, but one shares space with the micro-SD card, so you can't have both.

There's a reversible USB Type-C port, a separate standard 3.5mm audio jack, a non-removable 3,340mAh Li-Polymer battery, 802.11ac WiFi, NFC, and Bluetooth 4.2. The whole thing is powered by an octa-core HiSilicon Kirin 710 chipset (4 x 2.2GHz Cortex-A73 and 4 x 1.7GHz Cortex-A53) made by Huawei itself. The body looks quite high-rent, though it's plastic and not metal like the higher-level Huawei P30 phones.

There are no references to ruggedness at all, and the P30 Lite comes with a transparent protective boot and also a screen protector. So presumably no splashing in the water or rough handling. The phone supports face recognition and also has a fingerprint scanner on the back. That may sound like an odd placement, but it works great.

Since this is pretty much a bargain brand-name phone and doesn't have the same high-powered chipset as the top-of-the-line models, how well does it perform? Very well, actually. In the AnTuTu and PassMark Mobile benchmarks it scored a good 50% faster than any of the recent rugged Android handhelds and tablets we tested. Just another example of how consumer tech remains vexingly ahead of anything that longer-lifecycle, lower-volume rugged devices can do.

So the issue remains. Consumer tech is fast, advanced, light, glossy and, unless you go ultra-premium, cheap. There's a protective case for anything and everything. If a company buys a thousand devices like this Huawei P30 Lite, it'll cost $300,000. If it goes for dedicated rugged devices, it may cost a million. Or not (the rather rugged and well-sealed Kyocera DuraForce PRO 2 we recently tested ran just $400). Then it comes down to issues like service, warranties, ease of replacement, and so on. Personally, I very much believe in spending money for high-quality tools made for the job, but it's easy to see the lure of consumer tech.

So where will things go from here? Android will almost certainly maintain its dominance in smartphones. Unless Microsoft pulls an unexpected rabbit out of its corporate hat, pretty much all enterprise and industrial handhelds will be Android for the foreseeable future. I fear that Google’s increasingly heavy presence all through Android will become a more pressing problem. As will the lack of a clean, compelling “professional” version of Android (the current Android AOSP, Android for Work, etc., are insufficient).

With tablets it’s more difficult to tell Android’s future trajectory, mostly because of definitions of what counts as a tablet. I’d guess 1/3 iOS, 1/3 Android, and 1/3 Windows, but many statistics suggest otherwise. statista.com, for example, sees it as 58% iOS, 26% Android, and 16% Windows for worldwide tablet OS share in 2018. Android’s future in tablets may also depend on who will come out on top in Google’s current internal squabble between Android and the Chrome OS.

Version fragmentation remains a serious weakness in Android. And since many devices cannot be updated and version support ends quickly (Version 6, Marshmallow, is already no longer supported by Google), the roughly annual major version release obsoletes Android hardware very quickly. That’s no problem for consumer tech. But it is definitely an issue for vertical and industrial markets, where manufacturers must decide what to build and support, and customers must decide in what, and in whom, to place their trust.

Posted by conradb212 at 6:29 PM

March 25, 2019

The Drone in Your Future

Drones have made tremendous progress and will likely become part of many jobs.

In this article I will provide a brief overview of drone technology, and the application and impact it may have on field operations. Because, believe it or not, drones may well become part of your job.

Not too terribly long ago I came across an article and an accompanying YouTube video by a man who had attached a small digital camera to a remote controlled toy helicopter. The contraption worked well enough to yield video from high above the trees around his house. It was a fun project, the man felt, and there might actually be practical application for such a thing. But most who watched the video probably dismissed it as just a weird hobbyist science project.

What the man may or may not have known was that he may well have cobbled together one of the first "drones" as we know them today. I say "as we know them today" because unmanned aircraft that could shoot pictures have been used as far back as the Spanish American War in the late 1800s, back then mounted to a kite. The actual term "drone" apparently dates back to the mid-1930s when unmanned remote-controlled vehicles made the buzzing sound of hive-controlled male bees, or "drones."

Drones as we know them today are of much more recent vintage. They came about when "action cameras" like GoPros became small and light enough to be mounted onto inexpensive little helicopters, usually with four rotors. Available both as cheap toys and also as more serious flying cameras, drones quickly gained a reputation as intrusive pests used by paparazzi and other intruders of privacy, and also as a danger to real aircraft.

It has gotten to the point where the use of drones is more and more regulated, sadly in a haphazard and uneven way that varies frustratingly from place to place.

So let's see where drones are today and where they may be headed. Yes, you can pick up a cheap drone at any Walmart or similar store for very little money. Those do take video and they do fly, but they don't do either well. As a result, most quickly crash or get lost. They may be kids' toys, but since most lack any degree of stabilizing electronics, they are actually difficult to control and fly.

Those willing to spend more money will find a rapidly increasing degree of sophistication. Here at RuggedPCReview.com we invested in one of the latest drones of market leader DJI (which commands almost 3/4 of the market). The Mavic 2 Zoom retails for US$1,299, which is in the range of more serious consumer digital cameras.

For that you get a remote-control quad-copter drone with a 12-megapixel camera with 2x optical zoom and full gimbal movement. The Mavic 2 weighs about two pounds, its four rotor arms, each with its own little electric motor, twist away for compact storage, and it can fly for about half an hour on a charge of its removable 60 watt-hour battery. The remote controller works in conjunction with an iPhone or Android smartphone, which means you'll see live video right on your phone, and it can even live-stream to social media.

Now lest you think drones like the Mavic 2 are still just toys, this thing can go as fast as 45 miles per hour, it can go as high as 1,600 feet, it can be up to five miles away from the controller, and it has a (battery-limited) flying range of about 11 miles.

Thanks to GPS and numerous sensors, the Mavic 2 drone also has amazing smarts. It can follow preset GPS coordinates/waypoints. And it recognizes obstacles and will navigate around them. It has all sorts of programs, like following the pilot, flying circles around a still or moving object, doing special effects, and so on. And it also has a bunch of LEDs so it can be seen in bad light, and also to communicate status.

How difficult is it to control a drone like the Mavic 2? That depends on one's dexterity and how much learning time is invested. Any video game player will be right at home with the controller. There are different flight modes, including a restricted beginner mode. There are helpers such as auto-launch, and also auto-return and auto-land. What perhaps is most amazing is how rock-solid the Mavic 2 is in the air. If you let go of the controls, it sits still in the air as if on a tripod. Even modest wind doesn't faze it. You can leisurely peruse the scenery from high up there, looking any which way you want.

But now let's look at what all that means for field workers and the typical users of rugged mobile computers.

Quite obviously, being able to control a camera that can fly has enormous potential. It can easily go anywhere and anyplace where ladders, scaffolds, and potentially dangerous trekking would otherwise be needed. That makes it ideal for inspections and damage assessment, whether it's the big picture from way above or a very close-up view. It can be used to take breathtaking video of real estate, natural settings, bridges, tall buildings and plenty more. It can go inside of structures that may be unsafe. The application potential is endless.

Sounds almost too good to be true, doesn't it? But it's all real, and it's only going to get better and ever more sophisticated. The small 12-megapixel camera not enough? For a mere $200 more, DJI offers the same quadcopter setup with a 20-megapixel Hasselblad camera. Need a bigger drone with some payload capacity? They are available. Need a bigger screen? You can hook the controller up to a tablet. Is the screen not bright enough? DJI offers a controller with its own sunlight-viewable display.

And then there are the laws and regulations. Even for a small drone like the DJI Mavic 2, the controller insists on downloading the latest regulations and restrictions before each and every flight. If too close to a sensitive area, one may have to get permission to fly first. And anything that is considered commercial use of a drone requires a license. It's not terribly difficult to get, but still requires studying and passing a test. And depending where you are, the law of the land may make little legal difference between flying a drone and flying an actual aircraft.

That all said, it's easy to see that drones will likely become tools for the job for many field and other workers who now use rugged mobile electronics. So the sooner you start acquainting yourself with drone technology, the better off you are. -- Conrad Blickenstorfer, March 2019

Posted by conradb212 at 4:28 PM

January 24, 2019

How bright is your screen?

If you take a handheld computer or a tablet or a laptop outdoors and on the job, it's really important whether you can still see what's on the screen clearly enough to actually use the computer.

Whether you can or not depends on a lot of things, like how bright it is outside, whether there are reflections, the size of the screen, its sharpness and contrast, viewing angles, and so on. Experts calculate the "effective" contrast ratio in bright outdoor light by estimating sunlight compared to the light that's reflected back by the typical computer screen with its various treatments and several reflective layers. How well those internal reflections are controlled determines how readable the screen remains in sunlight.

With current display technology, of equal or perhaps even greater importance is the backlight. A strong backlight is generally better than a weak one. I say "generally" because a super-strong backlight can make a screen look washed out. That happens when the black pixels cannot block a backlight that's too powerful for a given screen technology. And, of course, a strong backlight drains the battery much more quickly.

Nevertheless, backlight strength is very important to outdoor and sunlight readability. But what exactly constitutes a "strong" backlight?

Quantifying light is never easy. Back in the day everyone knew that a 100 watt lightbulb was bright, a 60 watt so-so, and 40 watt was best used where you needed some but not a lot of light.

But that was before the short-lived era of spiral fluorescent light "bulbs" that were "100 watt-equivalent," and before the current era of LED lights that are also still sold by how many watts equivalent to an old incandescent bulb they generate. The brightness of LED bulbs is also stated in lumens and sometimes lux. Figuring out what lumens ("the total quantity of visible light emitted by a source") and lux ("a unit of measurement for illuminance") mean is too complicated to be useful in real life. And so for as long as people remember how bright a "real" 100 watt lightbulb was, newer-technology light bulbs will probably be sold as so and so many watts "equivalent."
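For what it's worth, the relationship between those units can be sketched in a few lines of Python. The bulb figures below are common rule-of-thumb approximations, not official ratings:

```python
# Rough lumens output of LED bulbs sold as "X-watt equivalent"
# (typical marketing approximations, not an official standard).
INCANDESCENT_EQUIV_LUMENS = {40: 450, 60: 800, 75: 1100, 100: 1600}

def lux_on_surface(lumens: float, area_m2: float) -> float:
    """Illuminance (lux) if the light fell evenly on the given area.
    Lux is simply lumens per square meter."""
    return lumens / area_m2

# A "100-watt-equivalent" LED bulb (~1600 lumens), spread evenly
# over 10 square meters of surface, averages about 160 lux.
```

This is only meant to show why the units confuse people: lumens describe the source, lux describes what lands on a surface, and neither directly tells you how bright a screen looks.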

What does all of that have to do with backlights? Not that much. Only that describing the strength of a backlight is just as complex and confusing as it is with light bulbs. So how is it handled?

In essence, the light emitted by a display backlight is given in a unit called candela per meter squared, or cd/m2. Candela is both the average light of one candle and, per Google dictionary, "the luminous intensity, in a given direction, of a source that emits monochromatic radiation of frequency 540 × 10¹² Hz and has a radiant intensity in that direction of 1/683 watt per steradian." So there. Now take that per meter squared. Auugh.

Given this obvious complexity, someone in the industry at some point suggested calling cd/m2 simply "nit." So 100 cd/m2 became 100 nits. Maybe "nit" is short for "unit." No one seems to know. Today, many display specs include a nits rating.

Even technically inclined folks often confuse luminance with illuminance. Illuminance is the amount of light striking a surface. Luminance is the brightness we measure off of a surface that is hit by light. So for our purposes, a light source behind the screen lights up the screen. How bright that makes the screen, its luminance, is measured in nits.

How bright is a nit? Or 100 nits? Since 100 nits is 100 candela per square meter, imagine a hundred candles sitting underneath a roughly 3 x 3 foot square. How bright would that be? I have no idea. So perhaps it's better to think of things that, on average, generate so and so many nits and go by that. Standard laptops generate about 200 nits. A good tablet or smartphone between 500 and 600 nits. Rugged laptops can generate as much as 1,500 nits, as can modern 4K HDR TVs.

What makes everything more difficult is where we view an illuminated surface. Even a 180 nits laptop can look bright and crisp indoors. Outdoors that same laptop would be barely readable. Outdoors the weather makes a big difference, as does being in the shade or under a blue or cloudy sky. And when it comes to competing with the sun, all bets are off. Sunlit surfaces can reach between 10,000 and over 30,000 nits.
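The sunlight problem can be roughed out with a back-of-the-envelope calculation. Here's a minimal sketch, assuming a simple diffuse-reflection model and a hypothetical 2% screen reflectance (real screens and their anti-reflective treatments vary widely):

```python
import math

def reflected_luminance(ambient_lux: float, reflectance: float) -> float:
    """Luminance (nits) reflected off a diffuse surface: E * rho / pi."""
    return ambient_lux * reflectance / math.pi

def effective_contrast(screen_nits: float, ambient_lux: float,
                       reflectance: float) -> float:
    """Very simplified effective contrast ratio in ambient light,
    treating the screen's own black level as negligible."""
    glare = reflected_luminance(ambient_lux, reflectance)
    return (screen_nits + glare) / glare

# A 500-nit screen reflecting 2% of roughly 100,000 lux of direct
# sunlight sees about 640 nits of glare, so effective contrast drops
# below 2:1 -- barely readable, strong backlight notwithstanding.
```

The point of the sketch: outdoor readability is a ratio, which is why reducing reflections matters at least as much as cranking up the backlight.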

So for better or worse, to get an idea how bright the screen of a handheld, tablet or laptop is, look at its nits rating. Which, unfortunately, is listed only in a minority of spec sheets.

That's where a screen luminance meter comes in. The one we use here at RuggedPCReview.com has a range up to 40,000 nits and can show current or peak luminance of a display. We use it in conjunction with a test template to not only record maximum luminance in nits, but also nits readings in steps from black to white.
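From such step readings, the darkroom contrast ratio is simply the white-level reading divided by the black-level reading. A trivial sketch, with made-up sample readings for illustration:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast ratio: white luminance divided by black luminance."""
    return white_nits / black_nits

# Hypothetical readings taken off a grayscale test template:
readings = {"black": 0.45, "25%": 60.0, "50%": 180.0,
            "75%": 320.0, "white": 450.0}

# 450 nits white over 0.45 nits black works out to a 1000:1 ratio.
```

The intermediate steps matter too: they reveal whether a display's brightness ramps smoothly or crushes shadows and highlights.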

Until something better comes along, every handheld, tablet or laptop screen spec should include a nits rating. Customers need to know that before making a purchase decision. -- Conrad Blickenstorfer

Posted by conradb212 at 5:39 PM

September 13, 2018

Ditching laptops for tablets — evolution or disruptive paradigm shift?

We do product testing adventure trips two or three times a year. That means packing a lot of equipment, including chargers, memory cards of various sizes and standards, all sorts of cables and adapters, suitable software, batteries and their chargers, and whatever it takes to keep the whole thing running. At the center of it all, on every trip, was a laptop powerful and competent enough to handle whatever came our way.

Not this time. We spent two weeks in the Central American nation of Honduras, on one of its Bay Islands, Roatan. Roatan, even now, remains wild and primal, and is, compared to almost every other such tropical destination, virtually untouched by tourism. Sure, there are small resorts here and there, but, by and large, there is no development. There is one road. No movie theater. No healthcare facilities to speak of. And this time we left the laptop at home. Instead, tablets and smartphones came with us. How did that work?

The jury is still out.

First I should say that we did still take a laptop with us, a small 11.6-inch Acer netbook. The little thing was the successor of earlier and even smaller Acer Aspire One netbooks that we’d used for years. Those, though powered by basic Atom processors, were amazingly useful. They easily handled all the many hundreds of photos we shoot on every trip, kept us in touch with civilization, and just generally did what we needed them to do. Initially, the larger little Acer had done even better, but that was several years of Microsoft “progress” ago. On this trip, the Acer, upgraded to the latest version of Windows 10, had become so slow it couldn’t get out of its own way.

Our big office Apple MacBook Pro, likewise, stayed at home. It’d become a victim of the “retina revolution” that had rendered its pre-retina screen unacceptably coarse to our eyes. And it weighs a ton. Not really, but it does compared to newer and sleeker gear.

So this time it was all iPads and iPhones to handle all the writings, test results, observations, and all the video and still images we took both above and under water.

The big thrill: viewing your pics and video on the big 12.9-inch iPad Pro was glorious. It makes everything look even better than it actually is. And image and video manipulation apps have progressed to a point where you can do some tasks faster, better, and more intuitively than on a laptop. Tablets are also easier to take along, pass along, and stow away than laptops. So from that perspective we didn’t miss laptops at all. Oh, and sharing pictures and videos among ourselves and with newly-made friends is ever so easy with Apple AirDrop.

But not all was well. Copying from camera to tablet or phone was a total drag. Much of it was Apple’s fault for making it ridiculously difficult to use third-party add-ons. For example, a multi-card adapter with a Lightning plug, specifically sold to work with Apple gear, was coldly rejected by our iPhones and iPads: “Peripheral not supported.” Another such device required downloading a special utility to work with the Apple gear, and the documentation for that was marginal to an extent where a lot was lost in translation. Apple stuff tends to be simple to use, unless you have to use it with something non-Apple.

And, of course, iPhones and iPads have no card slots. So in this day and age of 4k video, even 256GB Apple gear quickly fills up. And then what?

Then there’s the app barrier. While tech pundits love to declare desktops and laptops dead and we’re all post-PC, that ain’t so. Most serious work is still done at the desk with something more precise than the tip of a finger. Which means there is a vast user-interface and experience gulf between how things are done on a tablet and how it’s done on the desktop. Things that take a minute on conventional equipment can take half an hour on a tablet, or it can’t be done at all. Or take typing, the bread and butter of reporting and creating content. No tippedy-tap finger typing can replace a good physical keyboard. And being limited to using the good-looking but amazingly stupid Apple onscreen keyboard is infuriating.

So it was a struggle. We spent way too much time uploading pictures and video, keeping track of it all on a non-accessible file system, using apps that sometimes worked and sometimes didn’t, or suddenly were no longer supported at all.

At almost any time we felt we were caught between two paradigms, that of how productivity used to work and that of the new world of billions of smartphones.

So the jury is indeed still out. Old gear simply does not work anymore. Microsoft, and technological progress in general, have seen to that. Brand-spanking new gear and software doesn’t work very well in the tropics and on adventure trips, because all that new stuff wants the internet constantly, and make it pronto and super fast, or else you’re dead in the water.

Being back in the office felt like a return to the past. Uploading pictures, using desktop software, cataloging files, fixing things. Many images and vids that had looked glorious full-screen on the big iPad suddenly looked much less impressive and sort of lost on our big iMac27 desktops where a million other things are open and vying for attention.

Is what we experienced the hallmark of disruptive technology? One that has not evolved smoothly from what was there before, but leaped ahead to another way of doing things? One that cannot be smoothly bridged no matter how you try? One that means leaving skills and methods and experience behind and starting over? It does seem that way. And I say that as someone who uses both “old” and new every day. -- Conrad H. Blickenstorfer, September 2018

Posted by conradb212 at 6:21 PM

August 15, 2018

Why ruggedness testing matters

Ruggedized mobile computing gear costs more than standard consumer technology, but in the long run it often costs less. That’s because rugged computers don’t break down as often, they last longer, and there isn’t as much downtime. What that means is that despite the higher initial purchase price, the total cost of ownership of rugged equipment is often lower.

That, however, only works if ruggedized products indeed don’t break down as often, indeed last longer, and indeed do not cause as much downtime. Ruggedness, therefore, isn’t just a physical thing. It’s an inherent value, an implied promise of quality and durability. And that makes ruggedness testing so important.

Interestingly, ruggedness testing is entirely voluntary. While computers must pass stringent electrical testing before they can be sold, ruggedness testing isn’t officially required anywhere. It isn’t regulated. Electrical testing makes sure a computer adheres to standards, will not interfere with other equipment, and meets a wide range of other requirements. Why not ruggedness?

It’s probably because electric interference can affect third party equipment and systems, and possibly do harm, whereas ruggedness “only” affects the customer. And it’s also because unlike electrical interference standards that are absolutes, the degree of ruggedness required depends on the intended application. In that sense, the situation is similar to the automotive field where there are strict governmental testing requirements for safety and emissions (which affect third parties) but not for performance, comfort or handling (which only affect the customer).

Given the voluntary nature of ruggedness testing, how should it be conducted, and how relevant are the results of that testing?

For the most part, testing is performed as described in the United States Department of Defense’s “Environmental Engineering Considerations and Laboratory Tests,” commonly known as MIL-STD-810G. In some areas, testing is done in accordance with a variety of different standards, such as IEC (International Electrotechnical Commission), ANSI/ISA and others. But by far the most often mentioned is MIL-STD-810G.

So what, exactly, is the MIL-STD-810G? As far as its scope and purpose go, the document says, “This standard contains materiel acquisition program planning and engineering direction for considering the influences that environmental stresses have on materiel throughout all phases of its service life.” Minimizing the impact of environmental stresses, of course, is the very purpose of rugged design, so using the MIL-STD-810G as a guide to accomplish and test that makes sense.

How does the MIL-STD-810G go about its mission? In the document’s foreword, it says that the emphasis is on “tailoring a materiel item's environmental design and test limits to the conditions that the specific materiel will experience throughout its service life, and establishing laboratory test methods that replicate the effects of environments on materiel, rather than trying to reproduce the environments themselves.”

The MIL-STD-810G is huge and often very technical, as you’d expect from a document whose purpose is to provide testing for everything that may be used by the US Department of Defense. The operative term, therefore, is “tailoring.” Tailoring the tests to fit the conditions a specific item or device may encounter through its service life. That means it’s up to manufacturers to knowledgeably pick and choose the tests to be performed.

It also means that simply claiming “MIL-STD-810G approved” or “MIL-STD-810G certified” or even “designed in accordance with MIL-STD-810G” means absolutely nothing. Not even “passed MIL-STD-810G testing” means anything unless accompanied by an exact description of what testing was performed, to what limits, and under what conditions.

So how do we go about ruggedness testing that truly matters? First by determining, as the MIL-STD-810G suggests, the conditions that a specific device will experience in its service life. Let’s think what could happen to a rugged handheld computer.

It could get dropped. It could get wet. It could get rattled around. It could be exposed to saltwater. It could get crushed. It could be used in very hot or very cold weather. It could get scratched. It could be used where air pressure is different.

That’s the important stuff. There may be other environmental conditions, but for the most part, this is what might happen to a handheld. And it must be able to handle those conditions and events while remaining functional.

It’s instantly obvious that it’s all a matter of degree. From what height can it fall and survive? How much water can it handle until it leaks? How much crunching force before it cracks? How hot or cold can it be for the device to still work? How much vibration before things break loose? And so on. It’s always a matter of degree.

So how does one determine the degree of ruggedness required? Deciding that requires thinking through possible use scenarios and then, applying common sense, arriving at the appropriate level of protection. Too little and the device may break, which is bad for both the customer and the reputation of the manufacturer. Too much and it may become too bulky, heavy and expensive.

Ruggedness is a compromise. Nothing is completely invulnerable, freak accidents can and will happen, and figuring out how rugged is rugged enough is an educated judgment call, one based on knowledge and experience. Facilitating the desired degree of ruggedness is an act of balance and good design. And that, again, requires experience.

So how might one go about determining the proper degree of ruggedness? Here are some of the considerations:

A handheld computer will get dropped. It’s just a matter of time. What’s the height it may get dropped from? I am six feet tall. When I stand and use a handheld, the device is about 54 inches above ground, 4-1/2 feet. If it falls out of my hands while I use it, it’ll fall 4-1/2 feet to whatever surface I am standing on. That could be grass or a trail with pebbles and rocks on it, or anything between. And a user may be taller or shorter than I am. So one might conclude that if the device can reliably survive repeated falls from 5 feet to a reasonably unforgiving surface, such as concrete, we’d be safe. The device could, of course, fall so that its display hits a pointy rock. Or it could slip out of your hands while you’re standing on a ladder. But those are exceptions.
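The physics behind such a drop are simple enough to sketch. A minimal example, assuming pure free fall and a hypothetical half-kilogram handheld (real drop tests, of course, also care about orientation, surface, and repetition):

```python
import math

G = 9.81  # standard gravity, m/s^2

def impact_velocity_ms(drop_height_m: float) -> float:
    """Speed at impact from free fall: v = sqrt(2 * g * h)."""
    return math.sqrt(2 * G * drop_height_m)

def impact_energy_j(mass_kg: float, drop_height_m: float) -> float:
    """Kinetic energy at impact equals the potential energy m * g * h."""
    return mass_kg * G * drop_height_m

# A 0.5 kg handheld dropped from 5 feet (1.524 m) hits the concrete
# at about 5.5 m/s (roughly 12 mph), carrying about 7.5 joules.
```

That energy has to go somewhere, which is why housings, bumpers, and internal mounting matter so much more than the raw drop-height number in the spec sheet.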

So now we check the MIL-STD-810G on how to conduct a test where a device is dropped from five feet to concrete, and, surprise, there is such a test. If the device passes that test, described in Method 516.6, Procedure IV, all is (reasonably) well.

But do consider that MIL-STD-810G Method 516.6 alone consists of 60 pages of math, physics, statistics, graphs, rules, descriptions, suggestions and rather technical language. Simply claiming “MIL-STD-810G” is not nearly enough. The device specs must describe the summary result, and supporting documentation should describe the reasoning, specifics, and sign-offs.

Now let’s look at another example: sealing. How well can a device keep liquids from getting inside (which is almost always fatal to electronics)? Since rugged handhelds are used outdoors, they obviously must be able to handle rain. But that’s not enough. Working around water means that eventually the device will fall into water. What degree of protection is reasonable? Given that most consumer phones are now considered waterproof, rugged handhelds should be as well. The question then becomes what degree of immersion the device can handle, and for how long.

Here again, much is common sense. No one expects a device to fall off a boat into a deep lake and survive. But if it falls into a puddle or a shallow stream, it should be able to handle that. The most commonly used measure of sealing performance is IEC standard 60529, the IP Code. IP67, for example, means a device is totally sealed against dust and can also survive full immersion to about three feet. Unfortunately, the IP rating is imperfect. Several liquid protection levels are qualified with "limited ingress permitted," yet electronics cannot handle any degree of ingress. Amount of liquid, pressure of liquid, and time of exposure all matter. And what happens if a protective plug is not seated properly? So here again, specs should include the summary, with the exact testing procedure in supporting documentation.
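Reading an IP code is mechanical: the first digit rates protection against solids (6 means dust-tight), the second against liquids (7 means temporary immersion). A minimal decoder sketch covering the levels most relevant to rugged handhelds (the descriptions are my informal paraphrases of IEC 60529 levels, not official standard text):

```python
# Informal paraphrases of a subset of IEC 60529 protection levels;
# these are summaries for illustration, not official standard wording.
SOLIDS = {
    "5": "dust-protected (limited ingress permitted)",
    "6": "dust-tight (no ingress at all)",
}
LIQUIDS = {
    "5": "low-pressure water jets from any direction",
    "6": "powerful water jets",
    "7": "temporary immersion, roughly 1 meter for 30 minutes",
    "8": "continuous immersion beyond 1 meter, per manufacturer spec",
}

def decode_ip(code):
    """Split an 'IPxy' rating into its solids and liquids protection descriptions."""
    solid, liquid = code.upper()[2], code.upper()[3]
    return SOLIDS.get(solid, "unknown"), LIQUIDS.get(liquid, "unknown")

solids, liquids = decode_ip("IP67")
print("solids: ", solids)
print("liquids:", liquids)
```

Note how thin those one-line descriptions are compared to the actual standard; this is exactly why a bare "IP67" on a spec sheet should be backed by the testing specifics.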

I will not go through every single environmental threat, as the approach is always the same: What is the device likely to encounter? How is it protected against that threat? How was resistance tested? Who tested it? And what was the result?

Ruggedness testing is about common sense. If it is likely that a device will be used in an unpressurized aircraft, how high will that aircraft fly? Determine that and then test operation under that pressure.

If a device is likely to be used in very cold or very hot climates or conditions, determine how hot and cold it might get, then test whether it will work at those temperatures, for how long, and without an unreasonable drop in performance.

Test procedures for most environmental conditions can be found in MIL-STD-810G or other pertinent standards. I say “most” because some are not included. For example, I’ve often wondered why some rugged devices use shiny, gleaming materials that are certain to get scratched and dented on first impact. Scratches won’t affect performance, but no one likes their costly rugged device to be all scratched up after a week on the job. Scratch and dent resistance should be part of ruggedness testing.

The big picture is that serious, documented ruggedness testing, based on common sense and tailored for the device and application at hand, matters. The ability to hold up on the job is what sets rugged handhelds apart. Testing that ability is an integral part of the product, one that benefits vendors and customers alike. -- Conrad H. Blickenstorfer, Ph.D., RuggedPCReview.com

Posted by conradb212 at 7:18 PM

June 5, 2018

Make IP67 the minimum standard for rugged handhelds

Few outside of the rugged computing sector, and perhaps a couple of other fields, know what an "IP" rating means, or the specific significance of "IP67." Those inside those markets, however, cherish the designation for what it is — a degree of protection that brings peace of mind.

Really? Well, yes. If an electronic device is "IP67-sealed," that means neither dust nor water will get in. Specifically, there's "ingress protection" designed to keep solids and liquids out. Of the two, dust and water, it's primarily water that most are concerned about. Water as in rain, splashing, spilled drinks, hosing down, or going under. In essence, anything that is used outdoors is in more or less constant danger of coming in contact with water, and since water is unpredictable, one never quite knows how much water that will be, or where it'll be coming from. What we do know is that we'd rather not have water damage or destroy our expensive and necessary (and often irreplaceable) electronic gear.

Why is there a rating system? Because most sealing and protection isn't binary. It isn't either on or off. Sealing and protection come in degrees. Sealing a piece of electronic equipment means extra work, extra cost, and often some extra hassles, like more bulk and weight or less convenient operation. So sealing and protection should have minimal impact on cost and operation, and that is why they come on a sliding scale.

Which makes total sense. If a certain type of device will never be used outdoors it makes no sense to spend extra effort to make it dust and waterproof. If it'll never leave a vehicle, it doesn't really need to be protected against full immersion.

But if there is a chance that dust and liquids may be an issue, it still may make sense to consider the degree of protection needed. A laptop, for example, may get rained on, but it's not very likely to get dropped into a pond and few will want to hose it down with a high-pressure jet of water. It may be a different story for a vehicle mount designed to be used on tractors, scrapers, or bulldozers. Those are routinely hosed down, and if the panel computer mounted on them can be hosed down along with the rest of the vehicle, all the better.

What is most likely to get rained on or drop into a puddle or worse? That would be handhelds. Which means that handhelds should, by default, be sealed to IP67 specifications. IP67 means completely dustproof and also protected against full immersion in water down to about a meter, even if it should take half an hour to retrieve the device.

Why do I offer this suggestion? Because I've become aware of the considerable psychological difference between a device that is fully protected and one that offers only partial protection.

I've come to that conclusion based on many years' worth of rugged product testing and outdoor product photography. If a product is IP67-sealed, I know we don't have to baby it; we can sit it in a puddle or under a waterfall and it will be okay. If it's less than IP67, there is none of that confidence. It may or may not be damaged, and should water get in after all and destroy the product, then it's our fault, because it was only designed to protect to some level.

And that makes all the difference in the world. It's the difference between you feeling you need to protect the product and keep it from harm, rather than being able to use it any which way you need to, without the extra stress of having to baby it.

For me, perhaps the biggest example of the huge difference between "sort of" protected and fully protected are products that I use every day of my life, no matter where I go. Those being my phone and my watch. In my case an iPhone and an Apple Watch. My early iPhones and my first Apple Watch were not waterproof, and I was keenly aware of that weakness, always. So I was hyper-aware of the devices never getting rained on or subjected to any contact with water. They probably could have taken it, but just the prospect of damage and then being lectured by some Apple "genius" that, hey, you didn't take care of your iPhone and the damage is not covered — not good.

Apple apparently realized that, and most other smartphone manufacturers did, too, and so iPhones and Apple Watches have been fully waterproof for some time now. The psychological impact is substantial. Both are really IP68-sealed, meaning they can handle even prolonged immersion. So water is simply no longer an issue.

I am sure the psychological impact of using ruggedized, but only partially protected, handhelds on the job is even greater. They are expensive, they must not fail on the job, and they can't always be babied. And accidents can and will happen.

Which, to me, means that all rugged handhelds that will be used outdoors should be IP67 at least. There may be devices so complex that this is not easily doable. But with most it is doable. The technology is there. So let's get this done whenever possible. IP67 should be the minimum standard for any industrial handheld that may be used outdoors or near liquids.

Posted by conradb212 at 5:46 PM

February 24, 2018

A look at Apple's HomePod

by David MacNeill

I am the target market for HomePod. It’s an all-Apple house, subscribed to Apple Music and iCloud Drive — we’re all in. You can AirPlay other music services and devices to HomePod, but it’s really designed for fanboys like me. Can I order corn flakes from Amazon with it? No. Would I ever want to do that? No.

As a speaker, it is not overly expensive. The sky’s the limit for audiophile-grade speakers so $349 is not outrageous. One aspect that no one has raised in the reviews is that you don’t need a pair of them to fill a room. Pro audio studio monitors like my Yamaha HS system ($400 for the pair plus another $400 for the matching subwoofer/hub) and computer speakers like my Mackies ($100) are always sold in pairs. You only need one HomePod per room. I say it’s the best home sound you can get for the money.

Siri control is very well done — you can talk to it from across the room and it just works, even if music is playing. There’s room for AI improvement, certainly. I asked Siri to play KBSU (local NPR news station), which it streamed perfectly, but when I asked “her” to play KBSX (sister classical station) she couldn’t do it, saying it could not alter my queue while it was playing. I could not find a voice command to clear the queue. This only happens when streaming internet radio stations, not with music playlists — bug!

These are mere software issues, of course — the hardware is brilliant. We have entered the era of computational audio. Concepts like “stereo” and “surround 5.1” are obsolete. HomePod knows where it is, where the walls are, where the reflective surfaces are that might echo or create standing waves (the muddy sound you get when you place a speaker in a corner). It beamforms the center channel to be in the middle of your room, while ambient sounds mixed left and right go where you’d expect — it’s uncanny. This is the most three-dimensional speaker I’ve ever heard. There is no sweet spot, it’s just loud and clear and everywhere. And if you move HomePod, it recalibrates in seconds. HomePod has iPhone 6-level computing power making it the smartest in its class by a huge margin. That’s a lot of CPU potential — voice apps, anyone?

HomePod is Apple at its best. The thing does exactly what it says it will do, then gives you more than you expected. Best of all, Apple has zero interest in monetizing what you say to your HomePod. Nothing is stored, everything is encrypted end to end. It lives to serve and is totally trustworthy.

Posted by conradb212 at 8:39 PM

September 13, 2017

The impact of iPhones on the rugged handheld market

Apple has been selling well over 200 million iPhones annually for the past several years. This affects the rugged handheld market both directly and indirectly. On the positive side, the iPhone brought universal acceptance of smartphones. That accelerated acceptance of handheld computing platforms in numerous industries and opened new applications and markets to makers of rugged handhelds. On the not so positive side, many of those sales opportunities didn't go to providers of rugged handhelds. Instead, they were filled by standard iPhones. There are many examples where aging rugged handhelds were replaced by iPhones, sometimes by the tens of thousands. That happened despite the relatively high cost of iPhones and despite their inherent fragility.

By now it's probably fair to say that the rugged handheld industry has only peripherally benefitted from the vast opportunity created by the iPhone's paving the way for handheld computers. Why did this happen? Why did it happen despite the fact that iPhones usually don't survive a single drop to the ground without damage, despite the fact that only recently have iPhones become spill-resistant, despite the fact that iPhones need a bulky case to survive on the job, and despite the fact that their very design -- fragile, slender, gleaming -- is singularly unsuited for work on the shop floor and in the field?

One reason, of course, is that Apple is Apple. iPhones are very high quality devices with state-of-the-art technology. Apple has universal reach, excellent marketing and packaging, and massive software developer support. And despite their price, iPhones are generally less expensive than most vertical market rugged handhelds. Another reason is that creating a rugged handheld that comes even close to the capabilities of a modern consumer smartphone is almost impossible. Compared to even the larger rugged handheld manufacturers, Apple is simply operating on another level. The combined annual sales of all makers of rugged handhelds, tablets and laptops amount to only about what Apple alone sells in just over a week.

All that said, what can the rugged handheld market learn from Apple? Actually quite a bit.

Take cameras for example. The iPhone has almost singlehandedly obliterated the market for consumer cameras. People take pictures with their iPhones today, not with dedicated cameras. That's not only because it's convenient, it's also because iPhone cameras are quite excellent. They are state-of-the-art and have massive software support. Sending pictures from an iPhone to another phone, a computer, social media or anywhere else is easy. iPhone pictures and videos can easily be viewed on a big screen TV via AirPlay.

How important are cameras in smartphones? Important enough that the iPhone 7 and 7 Plus became hugely successful despite only modest overall improvements over the prior iPhone 6 and 6 Plus. The iPhone 7 Plus came with two cameras that seamlessly worked together, there was an amazingly successful "portrait" mode, and overall picture taking with an iPhone 7 Plus became so good that here at RuggedPCReview.com we switched all product photography and video from conventional dedicated cameras to the iPhone 7 Plus.

The same is again happening with the new iPhone 8 and 8 Plus. Relatively minor improvements overall, but another big step forward with the cameras. Both of the iPhone 8 Plus cameras have optical image stabilization, less noise, augmented reality features thanks to innovative use of sensors, stunning portrait lighting that can be used in many ways, and also 4k video at 60 frames per second, on top of full HD 1920x1080 video in 240 frame per second slow motion. And the new iPhone X ups the ante even more with face ID that adds IR camera capabilities and sophisticated 3D processing.

Is this all just for fun and play? Oh no, it goes way beyond that. Imaging has become essential in today's world, both on the personal and the business and enterprise level, and beyond. People document and record just about anything with their smartphones today, and that's made possible both by superior quality of those smartphone cameras as well as by the sheer convenience of it. People take pictures of anything and everything today. Good, useful, high quality pictures and video.

Can that be done with rugged handhelds as well? After all, the ability to quickly and easily document things on the job is becoming massively important. Sadly, the answer is no. Yes, compared to the truly dreadful cameras that used to be built into rugged mobile computing gear, the situation has improved substantially. But overall, even the best cameras in rugged gear lag way behind almost any modern smartphone. And the average cameras in rugged gear are way, way, way behind anything on a phone.

Just how critical is the situation? Very. A shocking number of mobile Windows devices simply use Microsoft's basic camera app, which has virtually no features at all and is very rarely matched to the camera hardware. That means you get blurry low-res pictures from camera hardware that seems, according to its specs, capable of significantly more. More often than not, there are barely any settings at all: no white balance, no manual adjustments, no smart focusing; image aspect ratios do not match display aspect ratios; there's huge shutter lag in stills and painfully slow frame rates in video. As a result, even the best integrated documentation cameras are barely good enough to generate passable imagery, even after training. That is a very regrettable oversight, and it's really a shame.

Then there's the hardware. Here again the iPhone, and most consumer phones, are light-years ahead of just about anything available in vertical markets. No one expects low-volume vertical market gear to match the likes of Apple and Samsung in advanced miniaturized technology, but the gap really should be much narrower than it is. Especially given that even consumer electronics leaders don't push the envelope when it's not really needed. For example, the last three generations of iPhones have all used the same 12mp imager size, but have still been able to push camera capabilities forward with every generation. Display resolutions, likewise, haven't changed from iPhone 6 to 7 and now 8. But that's because even the iPhone 6 Plus of three years ago already had a full HD 1920 x 1080 5.5 inch screen. Rugged handhelds, on the other hand, generally offer much lower resolution.

There's no need for specialized vertical market technology to always be at the consumer technology state-of-the-art. But it must not be so far behind as to impact the usefulness of the products.

With the new iPhone X, Apple again pushes the envelope. The new phone uses an OLED screen instead of a standard LCD. OLED (organic light-emitting diode) screens don't need backlights because the individual pixels emit light. That makes for deeper blacks and more vibrant color. The screen measures 5.8 inches diagonally, but since it covers the entire front of the device, the footprint of the iPhone X is much smaller than that of the iPhone 8 Plus with its 5.5-inch screen. Resolution is 2436 x 1125 pixels, which makes for a razor-sharp 458 pixels per inch.
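That pixel density figure follows from simple arithmetic: ppi is the length of the pixel diagonal divided by the physical diagonal. A quick sketch (note that the marketed "5.8 inch" diagonal is rounded; using roughly 5.85 inches, my assumption here, reproduces the quoted 458 ppi):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: length of the pixel diagonal divided by the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# iPhone X panel: 2436 x 1125 pixels; marketed as 5.8", closer to 5.85" unrounded
print(f"{ppi(2436, 1125, 5.85):.1f} ppi")  # prints 458.7 ppi
```

The same formula makes it easy to see how far behind a typical rugged handheld of the era sits: a 4.7-inch 800 x 480 screen works out to under 200 ppi.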

The iPhone X is expensive ($999 for a 64GB version and $1,149 for a 256GB model) and will likely be a standard bearer rather than Apple's volume leader for some time to come. But its advanced technology will shape expectations and make lesser technology look old. And that can make or break sales. So by introducing their latest iPhones, Apple not only reinforced the already high expectations customers have of a modern handheld with the refined iPhone 8 and 8 Plus, but boosted expectations with the iPhone X, whose technologies will soon be considered the norm.

What does all that mean to the rugged handheld market? Most likely that things will get tougher yet. Is there still merit in building handhelds rugged from the inside out? Definitely. But waterproof consumer smartphones and the availability of some really good protective cases have raised the ruggedness ante. Is being good enough technologywise still, well, good enough? Maybe, but probably not. Consumers are smart and their expectations increasingly dictate what makes the cut and what doesn't, even at work.

And that is the impact of the new iPhones on rugged handhelds. Everything looks a bit older yet, and expectations have been raised yet again.

So what are my recommendations to the rugged handheld industry?

The good news is that the iPhone is still your friend. It continues to open markets and applications to handheld computers where none were ever considered before. And some of those new markets and applications may go to rugged gear. So the opportunity is still there, and it's perhaps greater than ever.

To realize that opportunity, however, some things need to happen. The rugged handheld industry must:

-- Adopt contemporary technology. It doesn't have to be state-of-the-art, but it must be close enough so as not to be viewed as simply old.

-- Increase display resolution. One cannot claim to have professional tools when the professionals who use those tools expect, and need, more.

-- Pay attention to performance. A lot of the generic chips we see in use today are barely up to the job.

-- Do not disregard consumer tastes. If certain form factors, materials and sizes are appealing, use them.

-- Remember that there's one reason why customers pay more for rugged gear: ruggedness. Make sure it truly is. Have all external ruggedness testing done and available. Excel in this. If a consumer phone offers IP67, IP65 is no longer good enough.

-- Make ruggedness testing count. Explain what it means and how it benefits customers. No more generic "MIL-STD-810G compliant" claims without specifics.

-- Don't phone it in. Great cameras are part of the success of consumer phones. Make them great in rugged handhelds, too. I consider that crucial.

-- Hang on to, and improve, traditional rugged gear strengths. Outdoor viewability is one. Rain and glove capability is another. Scratch resistance, industrial-grade scanners, strong backlights, etc., are others. Make all that even better.

-- Clean up the software. An Android or Windows home screen with games, social media and entertainment doesn't impress enterprise customers. Stress functionality and security instead.

-- Drop the hype: If there are extra business or enterprise features, explain them clearly and outline their real world benefits and advantages.

So yes, the new iPhones have once again raised the bar, and thus made it that much more difficult for rugged handheld computers to measure up in many customers' eyes. But I am convinced that, if executed properly, there remains great opportunity for the rugged handheld industry. -- Conrad H. Blickenstorfer, September 2017

Posted by conradb212 at 8:44 PM

July 31, 2017

A future where quality is king -- A look at Zebra's 2017 Manufacturing Vision Study

On July 31st, 2017, Zebra Technologies Corporation published the results of their 2017 Manufacturing Vision Study on emerging trends that are shaping the future of industrial manufacturing. The broad result of the global study suggests that manufacturers are adopting IIoT (Industrial Internet of Things) and Industry 4.0 concepts to get better insights and information about their manufacturing process and, most importantly, to improve quality.

Why Zebra knows

Why should such a study come from Zebra? Isn't Zebra in the mobile printer business? They are, but not only that. Zebra's been around for almost half a century, having started as "Data Specialties Incorporated" back in 1969. They initially made a variety of electromechanical products but soon focused on labeling and ticketing systems. The name changed to Zebra in 1986. A dozen or so strategic acquisitions followed and by 2013 Zebra cracked a billion in sales. And then more than tripled that in one fell swoop by acquiring Motorola Solutions' enterprise business in 2014.

Why do I mention all that? Because Motorola Solutions' enterprise business mostly consisted of the former Symbol Technologies, a pioneer in retail and inventory management bar code scanning systems. In my capacity as Editor-in-Chief of Pen Computing Magazine I visited Symbol's headquarters on Long Island several times before its acquisition by Motorola in 2007. On each visit I was impressed with not only Symbol's scanning and mobile computing technology, but also by the importance they placed on exploring fresh ideas and innovative concepts to push the state-of-the-art in data collection to new levels and in new directions. And by how in tune they were with emerging trends and directions.

Despite Symbol going through rough times in the early 2000s, that innovative spirit never flagged, and that didn't change under Motorola nor under yet another big change when Zebra came along, and it's apparently still there. I don't know how many of the same people are still around, but that spirit of innovation, of dreaming up new concepts and uses, of trying new things, of viewing great hardware as just one part of making business better, that's still there. And it's at the heart of those vision studies that Zebra's been generating.

Vision studies

These vision studies are, truth be told, not super-easy to read and comprehend. On the surface, the studies are surveys of how customers use technology, how they view new developments, and what their plans are for the future.

But Zebra also provides considerable value-added by presenting the survey results with commentary and in the context of emerging trends and directions. That includes not only the obvious, the Internet of Things, but market-specifics, like the Industrial Internet of Things (IIoT), and Industry 4.0 where intelligent networked systems go beyond mere machine control via feedback systems, interdisciplinary cooperation, IoT technology, and advanced resource/demand management to morph into "intelligent" or at least "smart" factories.

Zebra doesn't even stop there. The value-added of the vision studies also includes relating their survey findings to emerging trends, mindsets, and revelations, and how technology can get customers from here to there.

In this latest vision study entitled "Quality Drives a Smarter Plant Floor: 2017 Manufacturing Vision Study," Zebra highlights a major shift and transformation, that from cost-cutting no matter what to quality first.

Exclusive focus on ROI is out. Focus on quality is in

In essence, what's happening is that almost everyone is realizing that cost-cutting in an effort to boost short-term return on investment has been a disaster. While it's certainly a good idea to eliminate waste, far too often cost-cutting has led to loss of quality. The whole concept of quality is extremely multifaceted, but cutting costs just to please investors is short-sighted: it almost inevitably lowers quality, and lowered quality will frustrate and anger customers, no matter how loudly they asked for low prices.

The result of realizing the dangers of exclusive focus on cost cutting to improve ROI is that quality is king again. But wait, don't you inevitably get what you pay for? If manufacturers used to cut costs to remain profitable even as quality suffered, won't a new emphasis on quality increase costs and raise prices?

The price of quality would indeed be higher prices if manufacturers continued business as usual. But Zebra's survey of 1,100 executives from automotive, high tech, food, beverage, tobacco and pharmaceutical companies showed considerable optimism about the positive impact of technology on both quality and revenue. The number of fully-connected smart factories will double in the next five years, the use of emerging technologies will rise, and respondents felt very positive on the impact of technology and automation. And Zebra cites a study by the American Society for Quality (ASQ) that claims that for every dollar spent on a quality monitoring system, companies could expect to see an additional $6 in revenue, a $16 reduction in costs and a $3 increase in profits.
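Taken at face value, the ASQ multipliers cited above are straightforward to apply to any proposed budget. A sketch (the per-dollar multipliers come from the study as cited; the $250,000 budget figure is a made-up example):

```python
# Multipliers from the ASQ study as cited: per $1 spent on quality monitoring,
# expect $6 more revenue, $16 in cost reduction, and $3 more profit.
REVENUE_MULT = 6
COST_MULT = 16
PROFIT_MULT = 3

def quality_roi(spend_dollars):
    """Project the cited ASQ impact figures for a given quality-monitoring spend."""
    return {
        "added_revenue": spend_dollars * REVENUE_MULT,
        "cost_reduction": spend_dollars * COST_MULT,
        "added_profit": spend_dollars * PROFIT_MULT,
    }

# Hypothetical $250,000 quality-monitoring budget:
for item, dollars in quality_roi(250_000).items():
    print(f"{item}: ${dollars:,}")
```

Whether those multipliers hold at every scale is, of course, the kind of question the survey's optimism glosses over.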

How to improve quality and ROI

But how, exactly, do they expect to get from here to there? From the dead-end of the cutting costs/lowered quality cycle to solid quality and growth?

In essence by using emerging technology, IIoT and Industry 4.0 concepts to keep better track of both the supply chain and of assembly lines.

As it stands, Zebra points out, there's no real-time communication between supply chain and production lines, resulting in too little inventory (slowing production) or too much of it (increasing costs).

And while 27% of those surveyed by Zebra are collecting data from production lines, supply chains and workers, that data sits in some database or spreadsheet instead of being shared, leaving the true value of the intelligence untapped. In addition, tracking points are few and far between.

While most manufacturers already track production, real-time monitoring is usually limited to just a few checkpoints, or so-called "gates." Visibility over the entire supply and production process can be vastly improved by adding more connected gates that use auto ID technology to provide real-time location, material allocation and asset condition at every critical juncture. This data can then be used to eliminate bottlenecks, communicate with suppliers, optimize just-in-time shipment, and ensure quality. And additional gates can help meet the ever more important demand for increased product variants.

Towards quality and visibility

What are the components needed to make all of this happen? There's "the Cloud," of course, with its various levels and services, ranging from simple sensors and gateways all the way up to the various enterprise IoT offerings with all of their services and capabilities.

What Zebra contributes here is the subset of IoT that the company calls "Enterprise Asset Intelligence." That includes the company's impressive roster of wearable and mobile technology, various methods of data capture and identification, voice recognition, smart IDs, as well as enabling hardware/software solutions such as Zebra's SmartLens for Retail or a variety of purpose-built manufacturing solutions that cover every aspect of the operation.

Tracking assets from start to end is key to optimal, consistent quality. That makes real-time location systems (RTLS) in the manufacturing environment an interesting and mandatory proposition. Collected data along every step of the production process can be used not only for quality control purposes. It can also be used to communicate directly from the factory floor to suppliers so they can quickly react, keeping supply and inventory at optimal levels. Zebra calls that "Sense, Analyze and Act."

Moving past strategy and into deployment

What it all amounts to is that with its 2017 Manufacturing Vision Study, Zebra provides not only information on how over a thousand manufacturing industry executives see the current and future use of data capture technology in their operations. Zebra also offers a blueprint for moving from a strictly ROI model to a future where simple cost-cutting is replaced by greatly enhanced visibility over both supply chains and production lines via a finer mesh of intelligent gates, optimizing both quality and cost in the process. Zebra feels that it is ahead of the curve in this, and one of the few players that has moved past strategy into real world deployments that will have profound effects on all their customers in the coming years.


Zebra's 2017 Manufacturing Vision Study

Zebra Wearables on RuggedPCReview: WT6000
Zebra Wearables on RuggedPCReview: WT41N0
Zebra Touch Computers on RuggedPCReview: TC5
Zebra Touch Computers on RuggedPCReview: TC70x
Zebra Touch Computers on RuggedPCReview: TC8000

Posted by conradb212 at 5:11 PM

July 27, 2017

GammaTech celebrates its 30th anniversary

GammaTech Computer Corporation (now called Durabook Americas) is celebrating its 30th anniversary this month, July 2017. That's amazing longevity in an industry where big names come and go. And it marks GammaTech as one of the pioneers in an industry and technology that truly changed the world.

It's hard to believe that it's been 40 years since the Apple II rang in the era of personal computers, seen first just as expensive toys for nerds and hobbyists, but then, four years later, legitimized by the IBM PC.

The 1980s were the Wild West era of personal computers. PC trade shows drew huge crowds. Trade magazines had many hundreds of pages every issue. Everyone wanted in on the action.

Taiwan early on established itself as a major player in the OEM industry, OEM standing for Original Equipment Manufacturer, companies that actually make the products sold by another company under a different name. One such company in Taiwan was Twinhead International Corporation, established in 1984. Initially a domestic maker of PCs and peripherals, Twinhead soon branched into global markets, setting up subsidiaries in the US, Germany, France, and the UK, as well as distributors in dozens of other countries.

Twinhead USA, now known as GammaTech, was the first such international branch and became instrumental in distributing a succession of interesting, innovative Twinhead Slimnote, Superlap, Supernote and Efio! brand laptops.

Times, however, changed. PCs became a commodity in a market increasingly dominated by a few large companies. Even IBM dropped out of the PC business, and other major players began concentrating on niche markets. Just as Panasonic launched the Toughbook and built that brand, Twinhead turned its focus to industrial and application-specific and mission-critical systems and launched the Durabook brand of rugged notebooks and tablets.

And when did the Durabook brand get started? It happened much as it did at Panasonic, whose rugged notebooks existed before the Toughbook name was introduced. Twinhead, likewise, launched its first military-grade rugged notebook, the N1400, around the turn of the millennium, but the Durabook brand itself began appearing in 2002.

In our 2003 Comdex coverage, we reported that "Twinhead announced their latest series of semi-rugged notebooks that may prove to be tough competition for Panasonic." We had ample opportunity to examine Twinhead quality and ingenuity firsthand in products the company created for and with longtime partner and customer Itronix. An example was the Itronix GoBook Duo-Touch, where our reaction was "Wow. They really nailed it this time." And the Itronix GoBook II — which started life at Twinhead's manufacturing plants before units were sent for final customer configuration at Itronix in Washington — became Pen Computing Magazine's Editor's Choice in the "high performance rugged notebook" category.

GammaTech made another show of strength a couple of years later in the 2005 Pen Computing Magazine Editor's Choice Awards, where Durabooks battled Panasonic Toughbooks to a draw with two awards each.

That same year, RuggedPCReview reviewed the Durabook N14RA. In the review we mentioned Twinhead's initial Durabook "Slim, mobile, ruggedized, and affordable" slogan, and that the N14RA certainly fit that bill. We liked not only the design and toughness of the N14RA laptop, but also praised Twinhead for a "masterful job selecting materials and textures". We also explained how the company had very productive relationships with major vertical market leaders such as Itronix, and had thus gained substantial expertise in the design and manufacturing of durable, ruggedized, and fully rugged mobile computing equipment.

Over the years, we have examined and reported on GammaTech Durabooks dozens of times. Around 2010-2012 we found Twinhead's N-Series of magnesium-bodied notebook computers "very solid and trust-inspiring." We were impressed with the Durabook U12C and R13S convertible notebooks that provided the same functionality as the market leaders, but at a significantly lower price.

More recently, we applauded GammaTech for making available the Durabook R8300, an updated version of the Twinhead-sourced General Dynamics Itronix GD8200. This was a terrific fully rugged laptop that appeared destined to vanish when General Dynamics closed down Itronix. But GammaTech brought it back better than ever.

We did detailed reviews of GammaTech's various rugged tablets and were particularly impressed with the still available Durabook R11: "to say that the GammaTech Durabook R11 is impressive would be an understatement." GammaTech then followed up with an updated R11 tablet in conjunction with a well-conceived keyboard that made for a very useful hybrid 2-in-1 ("A well conceived, well executed solution without any of the flimsiness and stability issues of most add-on keyboards").



Likewise, we were impressed with the updated Durabook SA14 semi-rugged laptop and called it "a good deal for anyone who needs a high-performance, highly configurable notebook that is significantly tougher than standard consumer laptops and should hold up well in daily use".


Overall, GammaTech could serve as a case study both of longevity based on exemplary quality and dedication, but also of the challenges of splitting business between OEM/ODM and one's own brands.

Given the company's experience in rugged markets and the remarkable quality and price points of its products, one would expect to find GammaTech among the rugged computing market leaders. But those spots are taken by Panasonic, with Getac and Dell battling for second place. The closing down of General Dynamics Itronix probably was a big blow for Twinhead. Another issue might have been product overlap as a result of OEM customers requesting similar but not identical models that could cause confusion when appearing under the Durabook name.

Now, celebrating its 30th birthday, GammaTech sports a trim, focused product lineup that includes the SA14 and S15AB semi-rugged laptops, the fully rugged R8300 laptop, the rugged R11 tablet that can also be used as a 2-in-1, the military-style 10-inch V10BB fixed-mount tablet/panel, and the multi-purpose 24-inch Durabook P24 workstation. Between the company's long-established experience and track record, the Durabook brand equity, and a fine-tuned product roster that's not only technologically up-to-date but also attractively priced, the pieces are in place for GammaTech to set its sights on gaining market share.

So congratulations, GammaTech, on your 30th anniversary, and for contributing so much to making computers durable, tough and rugged enough to hold up out there on the job and in the field. And for keeping them affordable. Best of luck for the future. You have much going for you.

Posted by conradb212 at 11:09 PM

June 16, 2017

Apple Watch Series 2 after three weeks

It's been three weeks since I finally gave in and bought a Series 2 Apple Watch. While the Apple Watch isn't a rugged device and thus not something we'd normally report on, it is a wearable device and wearable computing power is playing an increasingly prominent role. That's because unlike even handhelds, a wrist-mount is always there, always handy and it doesn't need to be stowed away.

So what's been my experience over the first three weeks with the watch? The good news: I am still wearing it. That could be because I wore wrist watches for decades (albeit not in a good many years) and am therefore used to the concept of wearing something on my wrist. It could also be that I found enough to like on the Apple Watch to keep wearing it. Most likely it's a combination of the two.

So here are some of my impressions, starting with the good:

Battery life — Battery life is better than I expected. With the way I've been using the watch I get at least two days of wearing it (including nights) and sometimes three. That's much less than a regular watch whose battery lasts years, but the Apple Watch likely has far more computing power than Apollo 13 had onboard when it went to the moon. And the Apple Watch charges quickly. An hour and a half and it's fully recharged.

Great watch — It makes a great watch. The Apple Watch can display an endless variety of watch faces. The one I use is a simple, elegant white-on-black analog one. The display is bright enough even in full sunshine. I configured the watch face so that it also displays the date/day of the week, remaining battery charge, weather, and the messages icon.

Messaging, weather and phone — I am surprised by how much I like having the weather info on the watch. I like knowing what the temperature is outside and whether it's sunny, cloudy or raining. I also like having my messaging app right on my wrist. This way I see incoming texts instantly, I can scroll through texts, and I can answer with quick canned responses or even brief messages (there's a letter recognizer function).

Knowing not only when I get a call or a text, but also being able to answer it right on the watch is great. Yes, that's what the iPhone is for, but given how large my iPhone 7 Plus is, I don't always have it on me. As long as my iPhone is within about 100 feet of the watch, there's a Bluetooth connection, and the watch has WiFi, too (albeit only the 2.4GHz band), so as long as my iPhone is on a network, the watch can be, too.

Using the watch as a phone works remarkably well. The sound is amazingly good. But whenever I use the watch as a phone I wonder why there isn't also a camera built into the watch.

Heart rate sensor — The Apple Watch's heart rate sensor is quite informative. For many years I thought my resting heart rate was around 60 and I really didn't know what it was when I was running. Thanks to a Sleep app on the Apple Watch I now know that my resting heart rate is more like 50. And thanks to my running app I now know my heart rate when I am exercising.

The sensors in the Apple Watch tell the watch how and when I am moving. That way the watch knows when to turn its display on for me to tell the time, and when I am lying down or sleeping. The Sleep app uses that information to show me whether and when I am restless during my sleep.

Onboard GPS — As stated, I bought the Apple Watch primarily because I didn't want to hold my iPhone in my hand or have it in a pocket when I go running. The original Apple Watch didn't have its own GPS and so relied on the iPhone for collecting positioning data. The Series 2 watch does have GPS and can record geographic information by itself. And since my preferred running app does have an Apple Watch app, I was looking forward to leaving the phone at home when running.

Initially, that didn't work. And I still haven't been able to get my favorite running app to reliably record runs. So I tried a different one (Nike's NRC+) and that one works just fine. The app records distance, speed, heart rate and route. Once a run is done, it passes the run info on to the iPhone. So that's what I have been using on my runs. However, I didn't like switching from my favorite running app one bit, just as I wouldn't like having to switch from my favorite word processor to another one.

So that's what I have primarily been using the Apple Watch for. If that doesn't sound like much, a) I'll doubtlessly discover other cool things, and b) it's no different on my computer where 98% of the time I use the same four or five software applications.


But not everything has been great:

Battery life — Battery life, while better than expected, is still a limiting factor. With the Apple Watch, I now have yet another piece of electronics that I have to remember to charge. Which means another charger and another cable, and something else to lose or forget or damage.

No truly compelling apps — I am still looking for the "killer app" that would truly justify the Apple Watch and make it indispensable. In the past most people absolutely needed a watch to tell time. Now we use smartphones for that. It has to be something that the watch does inherently better than the phone, and I haven't found anything like that.

A couple of years ago, Apple said there were already 8,500 Apple Watch apps. Today that number is likely much higher. Unfortunately, the majority of watch apps I tried are really quite useless. Not all of them, as some address specific needs, like tidbits of information that are handy to have without reaching for the phone. But for the most part, whatever the watch can provide on its tiny screen, the iPhone does much better on its larger screen.

Clumsy interface — It doesn't help that the Apple Watch interface and controls are marginal. The mosaic of tiny round icons without any labeling is initially bewildering. I actually got used to it better than I expected, but it's definitely not a perfect solution, especially when one has a large number of apps on the watch. Apple itself seems to realize that, as the next rev of the Watch OS will offer a simple alphabetical list of apps as an alternative.

As far as hardware controls go, the crown and pushbutton are far from self-explanatory. Use the push button to bring up the dock, then rotate the crown up and down to go left and right on the dock? Not great. Apps may require a tap, a double-tap or a hard push to do things, but you never know.

Disturbing Watch issue — Then there are other issues. After my wife recently returned from her vacation she found, much to her dismay, that the display of her own original Apple Watch had popped open! The battery inside was visibly swollen. Not good at all. Googling this disturbing situation revealed that bloating first-gen Apple Watch batteries are well documented, though 9to5Mac.com stated that the problem "appears to not be widespread or something that has made mainstream media headlines."

Apparently the issue is serious enough for Apple to extend the service coverage for swollen first gen Apple Watch batteries to three years. We made an appointment with the local Apple store where a rather disinterested and not very helpful "genius" said he'd only ever heard of one other such problem. Apple would replace my wife's watch. Given that her faith in the watch is shaken, my wife, who was visibly appalled at the poor Apple Store experience, asked if she could pay for an upgrade to a Series 2 watch. No can do, said the genius.

So the verdict on the Apple Watch remains inconclusive. Stay tuned.

Posted by conradb212 at 8:09 PM

May 30, 2017

Initial impressions of an Apple Watch holdout

So I finally got an Apple Watch. Series 2, the one that's waterproof and has onboard GPS. I chose the larger one with the 42mm screen. When you need reading glasses every millimeter counts. That meant a $400 investment for a bottom-of-the-line watch with the aluminum case (as compared to the much more expensive stainless steel and ceramic ones). I picked space-gray with a black Nike sport band.

What took me so long? I am not sure. As a native Swiss I love watches and there were times in my life where I collected them. And I love Apple. When Apple has something new, I almost always buy it right away. So why not the Apple Watch when it first came out?

Overall, a funny thing happened to watches. When I grew up and for most of my earlier life, everyone wore a watch. A watch was part of life. But that has changed. I see comparatively few people wearing watches today. And if they do, the watch seems more a fashion accessory or a status symbol than something that tells time, which is really all that watches do. Fact is, smartphones have replaced watches. So have cars and almost any other gizmo; they all tell time. Sure, it takes a bit longer to reach for the smartphone than to simply glance at your wrist to tell time, but we have gotten used to it.

That still doesn't explain why I didn't get an Apple Watch when it first came out. Between loving watches, loving Apple and loving technology, I should have but I didn't. And it's not that I have some inherent issues with watches that tell more than time. I bought the Microsoft watch almost a quarter of a century ago. I had numerous Casio and Timex watches that did all sorts of things.

So why not the Apple Watch? I am not quite sure. I recall being underwhelmed after watching Apple's initial introduction of the watch. It looked sort of blah instead of new and supercool. It couldn't really do anything that I felt I really needed. No Dick Tracy functionality there. And not even the promise of video calls that I'd seen in a Panasonic concept 15 years ago. Worst of all, the Apple Watch needed the iPhone to do much of anything. Why get something that makes you carry something else around? So I passed.

My wife finally got a Series 1 Apple Watch about a year after its April 2015 introduction. She was excited when she found that she could use the Apple Watch as a phone around the house. That meant there was no need to always have the iPhone with her, or trying to remember where it was. But using the Apple Watch as a phone quickly drains the battery. Apple claims up to 3 hours talk time. My wife is on the phone quite a bit, and the Apple Watch battery often didn't even make it past noon. She concluded that apart from being a conversation piece, the Apple Watch really couldn't do anything for her and so she stopped wearing it.

Then Apple introduced the Series 2 of the watch. Now the Apple Watch was waterproof, twice as bright, and had onboard GPS. That caught my attention. Waterproof to 50 meters (164 feet) and a brighter screen made it suitable for outdoor adventures. And onboard GPS meant you wouldn't have to take your iPhone along when jogging or running.

Waterproof to 50 meters

My interest cooled significantly when I found that Apple's idea of waterproof to 50 meters is very different from mine. To me it meant theoretically being able to take the watch on scuba, even past the recreational limit of 40 meters. To Apple it means the watch "may be used for shallow-water activities like swimming in a pool or ocean. It is also safe to wear it while showering or in a hot tub. However, Apple Watch Series 2 should not be used for scuba diving, waterskiing, or other activities involving high-velocity water or submersion below shallow depth." Truth be told, just like I never stuck my expensive iPhone in a waterproof case to take pictures and video while scuba diving, I wouldn't take a $400 Apple Watch underwater. But Apple's bombastic claim just rubbed me the wrong way.

So that left the onboard GPS. And that was what, in a spur-of-the-moment decision, made me finally get an Apple Watch. Getting a jogging GPS watch had been brewing in my mind for a while because I am a runner. Doing a good, exhausting run three times a week works for me and I've been doing it for many years. On top, I frequently run 5Ks. I keep track of my runs and performance with a dedicated running app on my iPhone. And have been holding the big iPhone 7 Plus in my right hand on many hundreds of runs. Not optimal.

Awesome packaging

Getting any Apple product is exciting. Apple knows how to package and present. The black box the Apple Watch comes in is over a foot long and, empty, weighs a pound. That's a bit more than an iPad Air 2. Just for the box. And the Apple Watch is really quite attractive in a minimalistic way, sort of like a shrunken iPhone. The semi-matte anodized space-gray finish looked terrific. Instructions are minimal, too, as they are with all Apple products. After all, Apple products are so simple to use that instructions aren't needed.

Except that with the Apple Watch they are. Needed, that is.

That's because trying to figure out the relationship between the iPhone and the Apple Watch is more like working with iTunes (widely known as Apple's most cryptic, befuddling and recalcitrant piece of software).

This Apple product does need instructions

Apple's been working to portray the watch as more than just an adjunct to the iPhone. Onboard GPS and more apps that don't need the iPhone to run are a good start, but fact is that the Apple Watch remains almost fully dependent on the mothership iPhone for almost everything that goes beyond just being a watch.

That means becoming very familiar with the iPhone's Watch app that's used to set and configure and manage the watch. In essence, the iPhone acts like a server to the Apple Watch, or like the Cloud to a Chromebook. The watch can do some things, but it doesn't have its own internet access, and what can one do these days without internet access?

Unfortunately, very little is obvious on the Apple Watch and the watch-related software on the iPhone. It does start innocuously enough, with browsing various watch faces which, on top of basic watch functionality, can accommodate up to four additional functions, like showing battery charge, date, weather or icons to bring up favorite apps.

Complications???

What comes next is more complex and getting it right will make or break how useful the Apple Watch becomes.

It starts with "Complications". In Apple Watch lingo "complications" doesn't mean difficulties or being confronted with unexpected problems. Instead, the word refers to additional functions that can be shown on the watch face. Apparently, Apple chose the word because old-style Swiss watchmakers used it to describe anything that made the basic clockwork of one of their masterpieces more complicated. Using it on the Apple Watch is, well, absurd.

Next comes figuring out what sort of apps that aren't "complications" should also run on your Apple Watch. As is, some but not all of the apps resident on your iPhone have Apple Watch apps, and those all show up as installable. If you are not super-careful, they will all be installed, which means you'll end up with hundreds of tiny icons on your Apple Watch. You won't know beforehand what those apps will do or what they look like. If you tap an app in the iPhone's Watch app, it will be installed. No explanations. Tapping again uninstalls it, but this seems a very inadequate method of figuring out what should occupy valuable real estate in the Apple Watch's limited memory and on its even more limited display.

But that's not all. For each of the apps you should then set what sort of notifications you want to receive on the Apple Watch. That's similar to the iPhone, where it's all too easy to be peppered with endless notifications all day long. Now THAT's a complication.

The iPhone Watch app also lets you play with the layout of all the icons that represent apps installed on the watch. This actually works a lot better than arranging icons on the iPhone itself. You can arrange the little watch icons any which way easily and we've seen some innovative layouts.

Finally, the Apple Watch also has a software dock. The dock serves the same purpose as the icon dock on a Mac or the fixed apps dock on the bottom of an iOS device: those are apps you want to access quickly without having to first locate them on some other screen or place.

As far as user interface goes, swiping left and right brings up different watch faces that can include different "complications." That can come in handy if you use the watch for different purposes. For example, you could have a workout watch layout that shows heart rate and workout data, one that's for travel with time zones and airline app "complications," or one that's pretty and shows pictures from a special folder.

Swiping up shows the Control Center where you can set airplane mode and other modes (silent, do not disturb, theater, ejecting water from the speaker after swimming, and more). Swiping down shows notifications that have come in. Some apps have "force touch," which brings up app-specific controls if you press hard. Pushing the crown toggles between watch face and icons, and press-and-hold brings up Siri. Turning the crown scrolls. A side button brings up the app dock. There's no multi-touch, but with a bit of practice, the interface is fairly easy to remember.

What is the watch doing? What does it need? How? When? Where?

Where it gets difficult and often quite frustrating is what the watch doesn't tell you. Simply put, the Apple Watch does things, but it almost never tells you what and how. With no pop-up windows to tell you what tapping a function does, and with no room for explanatory text on the small Apple Watch screen, using the watch is often trial and error. It's not obvious if there's a connection with the mothership iPhone or not, whether there's GPS reception, what apps may be sucking power in the background, how to close down such apps, whether the heart rate sensor is on, or what its sampling rate is. And so on and so on and so on.

The Apple Watch OS is said to be much simpler and more intuitive than it was in its original version. Problem is that simplicity is a double-edged sword. If it's done well, a system simply works and you don't have to worry about a thing. But if necessary information is withheld for simplicity's sake and you never know what the system does, it can become quite frustrating. And that is, to me, the Apple Watch's biggest shortcoming. Or one of them anyway, for there are a few others.

So far, the good:

I am still in the early phase of getting acquainted with the Apple Watch. I've had it less than a week. So let's start with what I like:

The Apple Watch is a very cool watch. The 302 ppi 1,000 nits OLED display is definitely "retina," very clear, very sharp, beautiful. The display is off when I am not looking at it, but comes on with just the slightest delay when I flick my wrist up to look at the watch. My favorite watch face is a very simple analog one. The four "complications" (it's difficult to refer to this and not use quotes...) I use are remaining battery, day and date, weather summary, and the message icon.

The battery life is better than I expected after my wife's experience with her watch. I know now that some apps eat up battery just as they do on an iPhone. Stay away from those or use them sparingly, and the Apple Watch can run two or three days between charges. And it charges quickly (less than two hours from empty to full).

Though there isn't anything just yet that I'd consider a "killer app," some apps do come in handy. Since the watch is always on me, I have time, weather, and some important notifications on me, always; I don't have to first look for the phone. When wondering what to wear or expect, a glance at the watch shows me the temperature and likely weather. I see messages right away when they come in, I can read them, respond, and even see pictures, albeit without being able to zoom in to see more detail.

I can see how the various workout and health apps and functions Apple built into the watch can come in handy. Likewise, several other apps have potential, though it will take time and some effort to figure out which ones and how they are best used.

And the not so good:

Unfortunately, there's quite a bit that I don't like or that leaves me underwhelmed.

The overall impression I have for now is that the watch just doesn't know (yet) what it is and/or wants to be. In that it reminds me of the initial iPhone which did not have apps as we know them now, just some minimal onboard functionality and the ability to look at websites that had special iPhone screens. That quickly changed and today the iPhone is a very powerful multi-purpose computer that no longer needs its own mothership (iTunes on a Mac or PC) to function.

After almost two years on the market, it seems that the Apple Watch has not reached that stage yet. Yes, there are apps that run on the watch, but their functionality is minimal and it's NEVER clear what connection to the iPhone app they should or must have to do what. I haven't seen anything on the watch that comes close to being a killer app. The watch remains an adjunct to the iPhone. That limits the watch to being of potential interest only to a relatively small fraction of iPhone users who feel like they also need an Apple Watch.

You could well argue that the iPhone itself really doesn't have a killer app that absolutely everyone must have and that has changed life as we know it. It's the overall usefulness of the iPhone that is its killer app. The iPhone has long since stopped being a "phone." The phone aspect is just a legacy leftover that telcos can still milk for a service few still rely on: making old-style voice phone calls. But the iPhone itself, like the Android smartphones it inspired, is a new and incredible tool that we can no longer do without. The Apple Watch, on the other hand, uses a legacy form factor, the wrist watch, to somehow extend the iPhone, which really does not need extending.

But what about the health apps?

But what about the health and exercise aspect that Apple is pushing? That's certainly laudable. The Apple Watch comes with apps that are closely synchronized with the main health app on the iPhone. And since the watch can record the heart rate, which the iPhone can't, and since the watch can record movement much better because it's always on the body, there's definitely more information to monitor and manage health-related functions.

But for that to really work, you have to wear the watch all the time, make sure everything is set up just right, have the iPhone handy, and then figure out what it all means, as the health app is quite a voluminous beast.

Health is certainly good, and if the Apple Watch makes more people aware of their health and what they can do to lead a healthier lifestyle, then that is a noble and worthwhile thing for sure. But it's definitely not a killer app, as it is of potential interest only to the fraction of the population that has an iPhone, and of those, the fairly small subsection that also has an Apple Watch, and of that contingent, only the relatively small number of watch owners who are into using all the health functions and use them religiously (because otherwise they are meaningless).

Why I got the Apple Watch

Just to reiterate, even though I am an Apple fan, a gadget lover, a watch lover, and I report on technology for a living, it took me almost two years to get an Apple Watch. And the reason that made me finally do it is because I thought it'd be great to no longer need to take the big iPhone 7 Plus with me when I go running, and to also record heart rate data. That is what made me buy the Apple Watch. For everything else, I was just fine with just the iPhone.

So what has been my experience with the Apple Watch to record my running? I don't know yet. Because I have not been able to get my running app, the one I essentially bought the Apple Watch for, to record my runs. That's a major disappointment and frustration. I know that the innovative developer of the exercise software I use spent much time in creating an Apple Watch app to go with their iPhone app (unfortunately that also made the iPhone app progressively more complex and much slower).

Yet, I can't get the running app to work on my new Apple Watch. It will search for a GPS signal, but when the message finally goes away, I have no way of knowing if GPS is now locked in or not, since the watch doesn't have an indicator for that. When I push the Start button on the app, nothing obvious happens, and when I wake up the watch upon finishing the run, the app says Start again, having recorded nothing. Or perhaps it did, for a while.

Although I like my running app very much and it's probably among the most used apps on my iPhone, I googled for alternatives and read their reviews. That was sobering. Most reviews are unclear on whether or not the watch apps run independently on the Apple Watch, or if they need the iPhone. Which makes a huge difference. I downloaded a few to see how they might work for me. Unfortunately, many apps these days force you to set up an account with them before you ever get to launch the app. All I want is to collect data for myself and store it on my device, folks!

Inconclusive

So I remain undecided on the Apple Watch. It has, so far, certainly not replaced my iPhone as my running companion, and at this point I am not sure it ever will. I have not found anything on the watch that I totally need, or even that seemed of compelling use. As a watch, it's quite nice and it is a conversation piece. An expensive one, of course, but then again, millions of people spend far more on watches that can do far less. -- Conrad H. Blickenstorfer, conradb212@gmail.com

Posted by conradb212 at 9:14 PM

May 23, 2017

History repeats itself: it's now the Surface Laptop

So the long awaited Microsoft Surface Pro 5 has finally been unveiled as the "new Surface Pro." In its media release, Microsoft calls it "the next generation of the iconic product line and the most versatile laptop on the planet. The new Surface Pro delivers the most performance and battery life in a laptop that is this thin, light and quiet."

So right off the bat, Microsoft makes it clear that it now considers the Surface Pro a laptop and no longer a tablet. And just for good measure, within the first two paragraphs of the May 23 press release, Microsoft doubles up with calling the new Surface Pro "the Surface Laptop," "the ultimate laptop," "making the classic laptop feel fresh again" and talking about "the most versatile laptop" and "powerhouse laptop."

The message is super-abundantly clear. Laptop. Not tablet. Laptop. With a bit of sarcasm one might ask what took Microsoft so long this time to abandon the tablet. After all, mobile computing historians recall how quickly Microsoft absorbed early tablet initiatives over a quarter of a century ago as soon as the pesky PenPoint platform had been defeated, and how Microsoft's own 2001 Tablet PC morphed into convertible notebooks before the Tablet PC was even officially launched.

What does it all mean? To put it bluntly on the table: for the third time Microsoft has realized that Windows simply isn't a tablet operating system and, given the role Windows plays on many hundreds of millions of desktops, it most likely never will be. Let's face it: with their myriad functions, features and tiny controls, the Windows power apps are not tablet material. As long as the Windows legacy lives, and that may be for quite some time, using touch and pens for an OS and software culture that was designed, from the very start, around the mouse will never work.

Eventually that may change, but that won't be anytime soon. It's been five full years since Windows 8 was launched, and with all of Microsoft's might, the vast majority of the scarce touch-centric apps remain basic and fairly useless. Even under the latest version of Windows 10, touch is just a thin veneer on top of Windows as it's always been.

But what of the remarkable success of the Surface tablets? Microsoft hasn't officially bragged much about that, as making its own tablet hardware put it in direct, head-on competition with its hardware partners, whose business fortunes depend largely on selling hardware for Microsoft's software.

It's easy to see why Microsoft was tempted enough to make its own hardware to actually go ahead with it. A good part of Microsoft's Windows problems is that the software is expected to run on whatever PC hardware it's installed on. That means literally billions of possible permutations of hardware components. Compared with Apple, which makes its own hardware for its own software and has total control over it, that's a daunting challenge.

So why was the Surface so remarkably successful? Because Microsoft, for once, controlled both hardware and software, and took advantage of that to make really good tablets. And also because of the remarkable success of the tablet form factor in general.

But Microsoft is not stupid and has most likely been painfully aware that there's just no way that, given the gigantic Windows legacy behemoth hanging around its neck, Windows will ever work very well on tablets. As long as users stay in the "tablet mode" of Windows 10, things work marginally well, but those apps are really only sort of an Android Light. So even Surface users will likely spend much of their time in desktop mode with the keyboard and a mouse attached.

So why not go the "it's not a bug, it's a feature" route and, rather than admitting that, as is, Windows isn't a good match for tablets, simply make the tablet more like a real Windows machine, like a laptop? Like the ultimate, most versatile, most powerhouse laptop. The Surface Laptop. One that works like a real old-style laptop with a decent keyboard, but one that, thanks to advancing technology and miniaturization, can also be used as a tablet if the user comes across some truly touch-centric tablet software.

So there.

All that said, the specs, of course, look great. Kaby Lake all the way (which conveniently precludes Windows 7), a nice big 12.3-inch 2736 x 1824 pixel screen (really the same as the Surface Pro 4's), impressive projected battery life, but rather costly ($2,199 with 16GB RAM and a 512GB SSD).

Just one thing. Though the Surface is now a laptop, it doesn't come standard with a keyboard or a pen. Those are extras. Is a laptop without a keyboard still a laptop?

Overall, it's a bit odd.

Posted by conradb212 at 7:56 PM

March 9, 2017

Are "mobile" sites really needed?

A few days ago I used one of the readily available website analysis tools to check RuggedPCReview.com. The resulting report gave me a stern "site not mobile-optimized" lecture.

"Mobile-optimized," of course, refers to the fact that sites on the one-size-fits-all world wide web are being viewed on a very wide range of devices with a very wide range of screen sizes. And it is certainly true that viewing a webpage on a 27-inch display is a very different experience from viewing it on a 4.7-inch smartphone.

That predicament was recognized early on, years and years ago, and for a while there were efforts to have alternate web experiences for small devices. Examples back in the 1990s were the NTT DoCoMo system in Japan, WAP (Wireless Application Protocol) as sort of a barebones browser, and numerous proprietary browser-type apps, none of which ever had a serious impact.

Most recently, "mobile-optimized" has come to mean web pages that automatically rearrange themselves when the default width of the page is wider (i.e., has more pixels) than the display it is viewed on. That's generally done with sort of a block system where blocks can sit side by side if there is enough room, or stack when there isn't. Or the blocks shrink in width, making everything taller and narrower. Which is just about the very opposite of good page layout. It's very difficult to make a site both look good and read easily when layout is done by shrinking and stacking and re-stacking blocks.

The bigger question is why it's even necessary. Back in the day it might have made more sense. That's because most desktop and notebook screens were first 800 x 600 pixels, then 1024 x 768, and then gradually more, with 1366 x 768 and 1440 x 900 common. As a result, most websites were designed to fit into these standard formats. That was a problem for early mobile devices, whose screens used the 240 x 320 pixel QVGA and at most the 480 x 640 VGA format for many years. To make matters worse, while most of those early mobile devices did have some sort of zoom feature, the devices just weren't powerful enough to make that work.

Now look at the situation today. While, amazingly, desktops and notebooks continued for many years with the same old resolutions, handhelds made tremendous progress. Laptops, especially, soldiered on with coarse, grainy displays and didn't change until Apple came up with the "retina" display concept, i.e., pixels so small that the naked eye can no longer see them individually from a normal viewing distance. Desktop monitors, too, resisted change for many years. Even today, "Full HD" 1920 x 1080 is considered state-of-the-art on the desktop, though anyone who has ever worked on even a 24-inch screen with just Full HD can attest that it's no fun. 4K displays remain very rare on the desktop, and even 2K displays are the exception.

Compare that sluggish progress to what's been happening on smartphones. Numerous smartphones now have 1920 x 1080 resolution, the same as most giant HDTVs. A good number offer 2K (2560 x 1440) screens, and there are even a few with 4K Ultra HD (3840 x 2160) resolution. That's incredibly sharp on a small smartphone display!

What that means is that the average webpage very easily fits on most modern smartphones, often with room to spare. Granted, while super-sharp, text and graphics and layout look tiny on even a big smartphone screen. That's what that wonderful pinching and zooming on capacitive touch screens is for! Combined with the superior performance of modern smartphones and small tablets, effortless zooming and panning around a webpage is a piece of cake. And, in my opinion, far, far preferable to having to deal with a dumbed-down "mobile-optimized" website that keeps tumbling and pinching its ugly blocks around. So there.

Is it easy to make even the most common and widespread technologies one-size-fits-all? It isn't, as Microsoft has learned the very hard way. But with the reality that most handheld devices have just as many (or more) pixels to work with as laptops and desktops, I see no reason to engage in needless "mobile-optimized" projects.

In fact, one of the nasty consequences of that rush to make those ugly stackable oddities is that we now have lots of corporate sites built for the (wrongly presumed) lowest common denominator. My bank, for example, now has a website that looks like it was designed for an early iPhone. It's one-size-fits-all, and it looks forlorn and ugly on a laptop or desktop, and it's super-inefficient to boot.

Progress isn't always progress.

Posted by conradb212 at 6:36 PM

February 22, 2017

In search of a prepaid, transferrable SIM

At RuggedPCReview.com, we often analyze and report on mobile computing devices with integrated WWAN (mobile broadband) capability and a SIM card slot. SIM (Subscriber Identity Module) cards are smart cards that contain a subscriber's phone number and certain data. Initially used just for the early GSM (Global System for Mobile Communications) networks, SIM cards are now also used by carriers that based their networks on the rival CDMA (Code Division Multiple Access) technology, like Sprint and Verizon. However, those carriers use SIM cards only for 4G LTE.

Anyway, it would be nice for us to be able to test device wireless data capabilities while we review them. We can't always ask vendors and manufacturers to get us test units with activated voice and data service, and we cannot, of course, set up cellular service for every such device that comes into our lab. So I wondered if there is an equivalent of prepaid phones and cards for SIM cards. That way, we could load a SIM with so and so many minutes or gigs of data and then simply insert it into a SIM card slot for testing. And then transfer it into another device for testing when the need arises.

Problem is, there is a difference between prepaid phone cards and SIMs. With a prepaid phone, you buy minutes of talk time for a particular phone, and the prepaid cards simply replenish the allocated minutes for that phone. SIMs, on the other hand, contain your phone number and contacts and such, which makes the SIM inherently portable. The phone number and data on the SIM go with a certain account and a certain network, and so, presumably, as long as you had minutes or a data allocation on a network, you ought to be able to transfer SIM cards between devices that have SIM card slots.

An online search yielded conflicting information as to whether there are prepaid SIMs and, if so, whether they can be inserted into another device without having to go through the hassle of setting up service again for the new device.

So I went to WalMart. They had aisles and aisles of phones and prepaid phones and prepaid phone cards, but just a single "SIM kit." It was not apparent from the packaging how it would all work, how minutes or data would be purchased, or what it would cost. I asked the folks at the electronics counter, and they had no idea.

I tried Best Buy next. Best Buy, too, has a very large number of phones from various vendors and for various networks. I explained our situation -- that we needed a prepaid SIM that we could move from device to device for testing of their communications -- and the answer was twofold. First, Best Buy only sells phones and carrier service, not cards and prepaid stuff. Second, such a thing as I needed did not exist.

This seemed unlikely to me. The only difference between a prepaid phone and a prepaid SIM would be that with a SIM you could take your phone number and contacts and other data from one phone to another; service on the carrier side would be the same: so and so many minutes or so and so much data is allocated to a particular phone number.

So I went to Walgreens. Lo and behold, they had three SIM card kits, each costing $10, for which you got a nano SIM card with adapters for use in devices that take the larger micro and standard SIM cards. Of the three, the GoPhone kit looked the best, with its packaging claiming to work on both iOS and Android devices, and also showing "refill and go" cards on its package. Further, GoPhone uses the AT&T network that we already use at RuggedPCReview.

Back at the office I browsed through the voluminous documentation, which included a frightening amount of fine print, the kind that leads to the modern problem of never really knowing what something actually ends up costing. Or what extra charges would be added for this, that, and the other thing. Yes, there were 18 pages of AT&T legal fine print. Annoying.

But I did follow the basic directions, stuck the tiny nano SIM card into the standard SIM adapter that our first test device required, and then used a WiFi connection to go to att.com/mygophone and set up an account. I was pleased to see that nothing but an email was required, name not mandatory. Which, given that there is no contract and it's all prepaid, seems appropriate. I quickly got an email back with a phone number associated with my new SIM, and a request to now log on and pay. That, of course, instantly required a name and the usual detailed personal information. So goodbye, (relative) anonymity, and can't you guys just give me the full scoop right upfront?!

And it's really not just buying minutes and then using them until they are gone. An AT&T account is required, you have to sign up for a service, the service automatically renews every month, etc., etc. You CAN purchase refill cards, and that, presumably, avoids using credit cards or automatic bank payments. So I'll look into the refill cards just for the sake of it. However, what I wanted most, a SIM that had so and so many non-expiring minutes on it, that I didn't get. Whatever you don't use you lose, and every month there's a new charge even if you didn't use the phone for even a single call. Boo!

Anyway, the device worked just fine. But after being done testing it, would I be able to transfer the SIM to another device with a different operating system? I tried that by going from an Android device to one using Windows 10 Mobile. It worked. No fuss no muss, the new device simply had service and worked just fine. These two devices, however, had virtually identical (or identical) hardware. What about putting the SIM into a different handheld or smartphone?

I tried that with another Android device of a different make and a different version of Android. And that one used the micro SIM format instead of the standard format. I found that popping the tiny nano SIM card in and out of its fragile plastic adapter isn't something one wants to do very often. And also that most SIM card slots obviously weren't made for frequent insertion and removal of cards. They are very fragile.

But the device came right up. No problem. What is confusing these days is figuring out what uses data and what still uses minutes. Most phone plans have now been changed to data plans, but the inexpensive phone-only plans still use minutes. And that's what the (relatively) inexpensive GoPhone plan does. So the phone wants to know if data use is permitted? No, I guess not. But it still shows data use as on, which greatly raises the paranoia of stepping into one of the phone companies' many gotcha traps.

The phone, however, works. And so far I've been able to transfer the SIM from one unlocked device to another without any issues.

Posted by conradb212 at 9:03 PM

December 15, 2016

Mobile Operating Systems Crossroad?

Interesting situation with mobile operating systems in the industrial and enterprise space. For many years, Windows Mobile (later named Windows Embedded Handheld) was the OS of choice, but then Microsoft stranded it. The industry hung in there with the abandoned mini OS for a number of years, but then slowly began looking at Android as a replacement. But now, just as Android is gathering steam in industrial handheld markets, Microsoft has finally introduced a Windows Mobile replacement in Windows 10 IoT Mobile Enterprise (what a name!). So major rugged and enterprise mobile hardware vendors are beginning to offer devices that support both of those operating systems (the latest is the Zebra TC70x).

Obviously this situation requires some detailed analysis. The rugged handheld devices market isn't huge, but according to VDC we're still talking several billion dollars, not exactly spare change. In many respects, Android has been its own worst enemy with version fragmentation and emphasis on consumer markets, but as of late the platform has made significant strides in becoming more acceptable to enterprise users. On the Redmond side, Microsoft's neglect of its erstwhile market-leading mobile platform and lack of serious follow-up on the final feasible version (6.5) may well have driven many customers to take a wait-and-see approach to any mobile version of Windows 10.

What does make Windows 10 IoT Mobile Enterprise interesting is the different approach Microsoft is taking with Windows 10. Instead of having different versions of Windows for different markets, this time Microsoft is using a "unified core" that's common to all versions of Windows. That doesn’t mean there’s just one Windows that runs on every type of device. That wouldn’t make sense given the great variety of devices and their variations of purpose, size and resources. But under Windows 10 there is a common core with each family of devices then adding suitable features to that core. In principle, that means that developers don't have to write different code for different Windows devices. If that turns out to be feasible, it is a major selling point for Windows 10 IoT Mobile Enterprise.

And it means that those in the mobile computing hardware business need to watch that situation very, very closely.

Posted by conradb212 at 4:08 PM

November 30, 2016

Sharp, clear web images

RuggedPCReview readers who view our site on a modern tablet, smartphone or high resolution monitor have probably noticed that many images on RuggedPCReview.com are noticeably crisper and sharper than those on the vast majority of websites. Why is that? It's because earlier in 2016 we started serving images in both standard and high resolution. That way, if your browser detects that you're viewing a page on a device with a high resolution display, it'll load a high resolution image. If you're viewing it on a standard display, it'll load a standard resolution picture. Let me explain.

While high and higher resolution is a huge deal in consumer tech and advertising (think 4K TVs, megapixels in cameras, and smartphone resolution), all that ginormous resolution is totally useless when creating images for websites. For example, a 4:3 aspect ratio picture from a 12-megapixel smartphone camera has 4000 x 3000 pixels. That's more pixels than fit on a giant 4K TV screen with its 3840 x 2160 pixels, let alone a "high definition" laptop screen with its measly 1920 x 1080 pixels. Now add to that the fact that most pictures on web pages are quite small. And that web pages need to load as quickly as possible. And that the image resolution standard for web page pictures really hasn't changed in almost a quarter of a century (it's still 72dpi or 96dpi).
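To put those pixel counts side by side, here's the quick arithmetic behind the figures quoted above:

```python
# Pixel counts for the camera and display figures quoted above.
camera_12mp = 4000 * 3000   # 4:3 photo from a 12-megapixel smartphone camera
tv_4k = 3840 * 2160         # 4K TV panel
laptop_fhd = 1920 * 1080    # "full HD" laptop screen

print(camera_12mp)               # 12000000
print(tv_4k)                     # 8294400
print(camera_12mp / laptop_fhd)  # ~5.8: one photo holds almost six laptop screens' worth of pixels
```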

What that means is that while cameras and TVs and phones and even some monitors have wonderfully high resolution today, the pictures we're viewing on the vast majority of websites are presented in dismally low resolution. They are fuzzy and grainy and really a bit embarrassing (though one might argue they fit right in with those terrible Wordpress layouts so many of today's websites use).

So what can be done? Well, we code our web content so that folks who view our site on a high-resolution device will get high-resolution pictures. Easy as that. Or rather, not quite as easy as that. Because while at RuggedPCReview.com we found a way to serve high-resolution pictures without slowing down page loading, figuring out how to do that wasn't easy. And it's extra work to generate, and code for, multiple versions of every image. But we figured our viewers are worth the extra effort, as are our sponsors and the terrific engineering invested in all the rugged technology we report on, analyze, and test.
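For what it's worth, the common mechanism on sites that do this is the browser comparing the display's device pixel ratio against the image variants a page offers (in HTML, via the srcset attribute) and requesting the closest match. The selection logic boils down to something like this sketch; the file names and the helper function are hypothetical illustrations, not RuggedPCReview's actual code:

```python
# Sketch of "1x vs 2x" image selection: pick the variant whose pixel
# density is closest to the display's device pixel ratio.
# File names below are made up for illustration.
def pick_image(device_pixel_ratio, variants):
    """variants maps a density descriptor (1.0, 2.0, ...) to an image URL."""
    best_density = min(variants, key=lambda d: abs(d - device_pixel_ratio))
    return variants[best_density]

variants = {1.0: "rugged-tablet.jpg", 2.0: "rugged-tablet@2x.jpg"}
print(pick_image(1.0, variants))  # rugged-tablet.jpg (standard display)
print(pick_image(2.0, variants))  # rugged-tablet@2x.jpg (retina-class display)
```

Since the standard-resolution file is the fallback, older browsers that ignore the extra variants still load the page normally, which is why the technique doesn't slow anything down.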

So if you've noticed that RuggedPCReview.com is different not only due to the total absence of annoying pop-ups, lame click bait, and ads you must read before you get to the site, but also in the very sharp and clear images we present, you're viewing the site on a contemporary high-res display, and we made sure it's the best possible viewing experience for you.

But aren't there still some older, fuzzy images on the site? Yes, old content remains as it was. But any new content is now created with multi-resolution pictures, and we're hard at work updating old imagery as well.

Posted by conradb212 at 4:27 PM

October 31, 2016

Rocky (not Balboa) has left the building

Back in 2003 we approached the then-titans of the rugged notebook industry with this challenge: "Send us whatever you consider your best all-purpose rugged notebook computer for a roundup!" Who did we send that challenge to? You'd think Panasonic, Getac and Dell or GammaTech. Panasonic, yes, but back then the other two we chose for the shootout were Itronix and Amrel.

Panasonic, of course, had been making rugged notebooks since 1996, and Itronix, too. Itronix, which at the time was a subsidiary of Telxon, had sent us one of their X-C 6000 Cross Country computers for review in the mid-90s. We wrote that "its bulldoglike ruggedness means you never had to worry about it." Back then, incidentally, we also reviewed the all-magnesium M3I PCMobile rugged laptop whose keyboard was removable, so the current trend towards 2-in-1s isn't something new (think 1994 Compaq Concerto).

Anyway, we reviewed the rugged Amrel Rocky II in 1998. We liked it and commented that "the unit's extraordinary ruggedness and clever, flexible sealing mean that it can be used just about anywhere." And actually introduced it in the article as "representative of the re-emerging class of rugged pen-enabled clamshell notebooks." That's right, re-emerging, back in 1998. Because there had been tough pen-enabled notebooks back in the early 1990s.

So that explains why it was Amrel and Itronix that were duking it out with Panasonic for the best rugged notebook in our 2003 shootout. Who won? It was pretty much a draw. We reported that "at decision time, what it may come down to are very specific requirements that only one of the three may be able to fulfill." We loved the Amrel's individually sealable connectors and its ultra-sealed keyboard. And the fact that it was the only competitor with zero flex. We loved the raw processing power of the Itronix GoBook. And we praised Panasonic for its "overall fit and finish that no one can match."

Getac was also already on the rugged scene at the time of our 2003 comparo. We'd examined their mid-range A320 in 2000, and the big A760 with its extra connectivity was also available. I should mention that the debate about what differentiates rugged and semi-rugged was as heated back then as it is today. With the possible exception that what used to be considered "rugged" a decade and a half ago would now be sold as "ultra-rugged" or "fully rugged," and what used to be "semi-rugged" is now routinely called "rugged."

Be that as it may, back in 2003 it was Panasonic, Itronix and Amrel we asked to step up to the plate. Panasonic, of course, went on to become the market leader in rugged notebooks, a position the company still holds on to today. Itronix was bought out by defense giant General Dynamics in 2006 or so, hung in there for a couple more years as General Dynamics-Itronix before GD shut them down.

Amrel, however, bravely soldiered on, even in the light of increasing competition from Getac and then Dell. While rarely at the technological forefront in terms of processors and ancillary technology, the company dutifully delivered very good rugged notebooks in the 13.3-inch, 15.1-inch and even 17-inch class. And that on top of also offering a full roster of 8-inch to 12-inch rugged tablets, and an equally impressive lineup of rugged handhelds that included Android-based models and special versions for biometric identification and such. Only a year ago, in late 2015, Amrel launched the ingeniously simple "Flexpedient" AT80 Android tablet that combined Ford Model T simplicity with modern technology and seemingly unlimited application potential.

We often wondered how Amrel managed to hang in there with modest advertising and only modest pursuit of review and PR opportunities. We figured it probably was because they very narrowly focused their efforts on their target markets and had no interest in becoming known to a wider circle of potential customers. But they did hang in there, and we had occasional contact with Amrel and discussions of areas of interest to the rugged markets.

Where did Amrel come from in the first place? From the bits and pieces of available information, it seems that the company was formed by Edward Chen, who had once been a VP at Motorola and had also co-founded Crete Systems in Taiwan around 1990. Amrel and Crete Systems had a close relationship, with Crete apparently the ODM/OEM for most of Amrel's rugged computers. There was also AMREL's subsidiary, Bionetek Corporation, which developed an early cardiac diagnostic system, and Solarmer Energy, which was into polymer solar cells. There was further MilDef, which had partnered with Amrel since the mid-1990s, and there was the German company Roda, launched in 1987. There was the MilDef Group, which consisted of MilDef Systems in Sweden, MilDef Ltd in England, MilDef AS in Norway, and MilDef Crete in Taiwan. Everything related in one way or another.

Why mention all this? Because after all these many years, in September 2016 Amrel in Southern California suddenly quit the rugged computer business and sold its computer division to MilDef Group. The news release stated that "MilDef Inc. will carry on AMREL's legacy under its MilDef brand name and continue providing our customers exceptional service and support after the acquisition." So most of the Amrel Rocky handhelds, notebooks and tablets are now sold as MilDef products.

Since Amrel had such a long history in rugged computing we, of course, wondered what happened. That doesn't seem to be clear even to insiders. Our email inquiries to both Amrel and MilDef went unanswered or shed little light on what happened. We consider that regrettable. There's too much history in that quarter of a century of rugged Amrel computers for it to simply go away. And though MilDef has taken over the line, at a time when the rugged notebook market leaders are charging full-speed ahead, it'd be good to know who MilDef is and what the company's intentions are, especially on the US market. After all, we're talking about a product line that had once battled the leaders in the field to a draw. They should not just vanish into obscurity.

Posted by conradb212 at 10:54 PM

September 5, 2016

Intel introduces Kaby Lake, the 7th generation of Core processors

In August, Intel officially introduced the first few of its 7th generation Core processors, codenamed "Kaby Lake." That comes at a time when the news about PCs generally isn't very good, when Microsoft has a very hard time convincing users to switch to Windows 10, and when it's becoming increasingly difficult for vertical market hardware manufacturers to keep up with Intel's rapid-fire release of new generations of high-end processors.

7th generation Kaby Lake also comes at a time when 4th generation "Haswell" processors are considered quite up-to-date in the mobile and industrial PC arena, when 5th generation "Broadwell" makes customers wonder how it's better than Haswell, and when 6th generation "Skylake" leaves them befuddled because, well, what happened to Broadwell? And the rather expensive (US$281 to US$393) new 7th generation chips also come at a time when customers balk at paying more than a hundred bucks for a tablet or more than two or three hundred for a basic laptop.

So what is Intel thinking here? That they simply must follow "Moore's Law," which says that the number of transistors that fit on a given piece of chip real estate doubles every 18 months? Or that, like Disney, catering to a small clientele to whom price is not an issue is the profitable way to go? It's hard to say, especially since the generation game really hasn't been about meaningful increases in performance for a good while now.

That's certainly not to say that the new chips aren't better. They are. Intel loves to point out how many times faster new generations are per watt than older ones. And that's really getting closer to why all of this is happening. It's mostly about mobile. See, back in the day everyone knew that you just got an hour and a half max from a notebook before the battery ran out, and that was grudgingly accepted. But then came Steve Jobs with the iPad that ran 10 hours on a charge. And somehow that's what people came to expect.

On desktops, performance per watt hardly matters. You plug the PC in and it runs. Compared to heating and air conditioning, toasters, ovens, TVs and a houseful of light bulbs, whether a chip in a desktop runs at 17 watts or 35 watts or 85 watts hardly matters. But in mobile devices it does. Because Steve Jobs also decreed that mobile devices needed to be as slim as possible, big, heavy batteries were out. It all became a matter of getting decent performance and long battery life. And that's one of the prime motivations behind all those new generations of Core processors.

Now combine that battery saver imperative with a quest to abide by Moore's "law" (which really just was a prediction) and — bingo — you get generation after generation of processors, each a bit more efficient and a bit quicker.

How was it done? By coming up with a combination of all sorts of clever new power-saving techniques and by continuously shrinking the size of the transistors that are the basic building blocks of a microprocessor. To provide an idea of just how small things are getting inside a microprocessor, consider this:

A human hair is on average about 100 micrometers thick, a tenth of a millimeter or about 4/1000th of an inch. The 8080 processor that started the PC revolution in the mid-1970s with early microcomputers like the MITS Altair was based on 6 micrometer lithography, or "process technology." Process technology is generally defined as "half the distance between identical features in an array." So the smallest distance between two transistors in an 8080 was 12 micrometers, or about an eighth of the thickness of a human hair.

Over the decades since then, process technology has been miniaturized again and again and again. Whereas with that old 8080 chip (which cost three or four bucks at the time) it was 6 micrometers, or 6,000 nanometers, the 7th generation of Intel Core processors uses 14 nanometer process technology. Down from 6,000 to 14. So whereas the old 8080 had about 6,000 transistors total, with 14 nanometer process technology Intel can now fit over a billion transistors onto the same amount of chip real estate. And with the die size of your average 7th generation Core processor being larger than that of the little old 8080, it's probably more like five billion transistors or more. The head spins just thinking about it.
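As a sanity check on that "over a billion" figure: feature sizes shrink linearly, so transistor density grows roughly with the square of the shrink. Working it out from the numbers above:

```python
# Rough area-scaling check for the 8080-to-14nm comparison above.
old_process_nm = 6000    # Intel 8080: 6 micrometer process
new_process_nm = 14      # 7th-gen Core: 14 nanometer process
transistors_8080 = 6000  # approximate transistor count of the 8080

# A linear shrink squared gives the density gain per unit of chip area.
density_gain = (old_process_nm / new_process_nm) ** 2
same_area_transistors = transistors_8080 * density_gain

print(round(density_gain))           # 183673, i.e. roughly 180,000x denser
print(same_area_transistors > 1e9)   # True: over a billion on the same real estate
```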

The upshot of it all is that the hugely larger number of logic gates on a chip offers vastly greater computing performance, which you'd think would require vastly more power. But thanks to the hugely smaller size of all those transistors, that's actually not the case. Between the tiny size and all those logic gates available to run ultra-sophisticated power-saving operations, the chips are both more powerful and use less energy.

Now that said, there appears to be a law of diminishing returns. It's a bit like video games where early on each new generation had much better graphics, but now things are leveling off. The visual difference between mediocre and acceptable is huge, the difference between very good and terrific much smaller, and the difference between super-terrific and insane smaller yet. Same with processor technologies.

As a result, the performance and efficiency increases we've seen in the benchmark testing we do here in the RuggedPCReview lab have been getting smaller and smaller. By and large, 5th generation Broadwell offered little more than 4th generation Haswell. And 6th generation Skylake didn't offer all that much over Broadwell. The last really meaningful step we've seen was when 4th generation Haswell essentially allowed switching mobile systems from standard voltage to ultra-low voltage versions of chips for much better battery life (or a much smaller battery) at roughly the same performance. Yes, each new generation has tweaks and new/improved features here and there but, honestly, unless you really, really need those features, larger performance gains are to be had via faster storage or a leaner OS.

So there. As of late summer 2016, there are six 7th generation Kaby Lake Core processors, all mobile chips. Three are part of the hyper-efficient "Y" line with a thermal design power of just 4.5 watts, and three part of the merely super-efficient "U" line with TDPs of 15 watts. The primary difference between the two lines is that the "Y" chips run at a very low default clock speed but can perform at a much higher "turbo" clock speed as long as things don't get too hot, whereas the "U" chips have a higher default clock speed with less additional "turbo" headroom. Think of it as the difference between a car with a small, very efficient motor that can also reach very high performance with a big turbo, versus a vehicle with a larger, more powerful motor with just a bit of extra turbo kick.

In general, Intel has been using what they call a "tick-tock" system where generations alternate between "tick" (yet smaller process technology, but same microprocessor architecture) and "tock" (new microprocessor architecture). By that model, the 7th generation should have switched from 14nm to 10nm process technology, but it didn't and stayed at 14nm. Apparently it gets more and more difficult to shrink things beyond a certain level, and so Intel instead optimized the physical construction of those hyper-tiny transistors. That, they say, allows things to run a bit cooler and requires a bit less power, resulting in, according to Intel, a 12-19% performance gain, mostly through running the chips at a higher clock speed.

The architectures of both the cores and the graphics haven't really changed. But there are some additions that may be welcomed by certain users. For example, Kaby Lake has much better 4K video capability now, mostly in the hardware encoding/decoding areas. And a new implementation of Speed Shift lets the CPU control turbo frequency instead of the operating system, which means the chip could speed up much faster. We'll know more once we get to compare Kaby Lake performance and efficiency with that of the predecessor processor generations.

There's some disturbing news as well. Apparently, some discussions and agreements between Intel and Microsoft resulted in Kaby Lake not really supporting anything before Windows 10. We don't know if that means older versions of Windows simply would not run, or just that they wouldn't run well. Given that so far (early Sept. 2016), Windows 10 only has 23% of the desktop OS share, any restriction on using older versions of Windows on new chips seems both ham-fisted and heavy-handed.

For a detailed tech discussion of all things Kaby Lake, check AnandTech.com here.

Posted by conradb212 at 8:16 PM

August 29, 2016

Congrats to Xplore Technologies: 20 years of rugged tablets, and only rugged tablets

At the January 1997 Consumer Electronics Show in Las Vegas, I walked into the South Hall of the Las Vegas Convention Center on the lookout for something — anything — new and exciting in tablets or pen computers. Sure, Microsoft had announced Windows CE at the Fall Comdex in response to Apple’s Newton Message Pad and the emerging “Palm Economy,” and our bi-monthly Pen Computing Magazine was doing well. But, by and large, handhelds and tablets were very far removed from the booming world of desktop computers and laptops and printers and the latest of absolutely-must-have PC software.

But there, amidst all of the glitzy, glossy booths of mainstream computing was… an even glitzier and glossier booth by a company I had never heard of. They called themselves Xplore Technologies, and they were thinking big. There had, of course, been rugged computers before, but most were quite utilitarian and often looked a bit unfinished. Xplore’s Genesys, on the other hand, looked like something right at home on the Starship Enterprise. Cool industrial design, bold lines, even bolder plans. Having seen my share of grand plans I admired the effort but wasn’t convinced that these folks' vision was actually going to see the light of day, let alone become a success. However, between a persuasive VP of Marketing, the grand booth, and the look of the various models (there weren’t any fully functional production units yet), I agreed to an interview with Xplore's boss and committed to coverage in our print magazine.

And so this is what we ran in Pen Computing Magazine Volume 4, Number 15, page 41, in early 1997:

Xplore Genesys

Pen technology “dream team” presents impressive new system

Every so often, individuals — or groups of individuals — get dissatisfied with the status quo and set out to create new solutions, new forms of government, new companies, or whatever it takes to make things right.


One such group of individuals found the general status of mobile and pen computing lacking and joined together to form Xplore Technologies Inc. Xplore has an impressive roster of seasoned professionals from all areas of mobile computing, both on the vendor and customer sides. Founding members have earned their professional experience and reputations at companies such as GRiD, Telxon, Motorola, Intel, Fujitsu, Telular, and a number of vertical market industrial clients.


Fueled by a common vision of offering technologically advanced “whole product solutions," a firm belief in an annual pen computing market growth rate of over 30%, and financing by a small group of supportive investors, the Xplore team is conjuring up a compelling business strategy based on tactical partnerships with companies that provide products and services complementary to Xplore’s technology offerings.

Believing that existing pen and mobile systems often fail because they are created by technology companies without real knowledge of their markets, Xplore not only recruited industry representatives into their core staff, but also developed the specs of their “Genesys” product family in conjunction with customers from their targeted markets — utilities and public safety. The result is a very functional, very attractive design that’s both technologically up-to-date and ready for future expansion, a necessity in markets where equipment is expected to have a life cycle of several years.

The Xplore Genesys pen computer, much like the TelePad 3, is based on a main "brain," or core, that houses the main logic board, power, memory, and screen, and X-pods that contain peripheral functionality, such as GPS systems, additional batteries, wireless communications, and various I/O options. The X-pod expansion bay is shaped so that it doubles up as an ergonomically shaped hand grip for the unit. Xplore calls both the core unit and the X-pods "environmentally indifferent," i.e. water resistant, with shock-mounted components in a sealed composite (or optional aircraft aluminum) inner housing for the core, and equally impressive sealing of the pods. The unit is further protected with impact resistant exterior moldings, all combining to give a Genesys computer a good chance to survive a 4-foot drop onto concrete.

As should be expected from a brand-new, "clean slate" design, the Genesys includes thoroughly modern components, starting with a very low voltage Intel Pentium processor running at 133 MHz; three LCD screen options (two color, including a TFT High-Brite version, and one monochrome), all offering a large 10.4" diagonal viewing area and 800 x 600 SVGA resolution; 64-bit PCI bus architecture; an electrostatic pen interface with touch screen functionality; no less than three docking systems; and — of course — Windows 95. There is room for up to 64MB of RAM and up to 3.2GB of hard disk space.

Since Xplore projects a significant number of Genesys slates to be vehicle mounted, special care was given to an optimally designed vehicle dock. The airbag zone compliant dock has a separate breakout box for cable management and uses standard PC connectors. The desktop dock provides access to CD-ROMs, LANs, modems, keyboards, and external monitors, essentially turning the Genesys into a fully functional desktop computer. The office dock, finally, is a space-saving design with a LAN controller that allows mounting up to four tablets on the wall for easy access. All docks are based on the same mechanical docking head, and all offer fast charging capabilities and expanded intelligence through an I/O controller in the docking head.

As of this writing (January 1997), Xplore was in the process of assembling final Beta units for testing with a limited number of customers in February 1997. According to Xplore, production is scheduled to begin in late March.

Our impression? What we have here is a high powered group of very qualified people developing and marketing what they believe is the very best product for the pen computing and mobile market. This is good news for pen technology in general, and for companies seeking a state-of-the art mobile solution in particular. — Conrad H. Blickenstorfer, Pen Computing Magazine

And here's what it looked like in that early 1997 issue of Pen Computing Magazine:

It’s hard to believe that it’s been almost 20 years since I wrote that article. And pretty much exactly 20 years since Xplore began making rugged tablets. Back then, Xplore’s competition included Teklogix, Dauphin, DES, Epson, Granite, Husky, IBM, Itronix, Kalidor, Microslate, Mitsubishi, Norand, PGI Data, Telepad, Telxon, Texas Micro, WalkAbout and others. All gone, absorbed, or no longer in the rugged tablet business. Xplore, however, is not only still here, but expects fiscal 2017 revenue of between US$85 million and US$95 million. And Xplore is #2 in global rugged tablet marketshare. Quite impressive.

It hasn’t been an easy ride for Xplore. There was customers’ general reluctance to embrace the tablet form factor. There were the special demands of tablets that always seemed a year or two ahead of available technology. Despite Microsoft Windows for Pen Computing and then the Tablet PC Edition of Windows XP, Windows never was a natural for tablets. So business was hard, even after the iPad opened the floodgates for tablets.

Yet here Xplore is, now with the complementary product line of fellow tablet pioneer Motion, stronger than ever. It’s ironic that while once it was lack of acceptance of tablets that was Xplore’s biggest problem, now it’s the very success of tablets that’s a challenge — with tablets so cheap, many potential customers just buy consumer tablets and stick them in a case.

So after 20 years of making tablets and nothing but tablets, questions remain. On the far end, how rugged is rugged enough? What degree of ruggedness is compelling enough to sway possible markets, and at what price point? How can one profitably grow while remaining under the radar of consumer electronics giants (so they won’t start an “active” or “outdoor” or “adventure” version of one of their products)? None of these questions are easy to answer. Or the answers easy to implement.

But having been around for 20 years and having the benefit of all that experience, few are in a better position to succeed than Xplore Technologies. Here's to the next 20, Xplore!

Posted by conradb212 at 6:05 PM

August 9, 2016

Why we take things apart and show what's inside

At RuggedPCReview, we take things apart. We open up handhelds, tablets, panels, notebooks and industrial PCs. We dissect them methodically, documenting our progress, jotting down observations and commentary. What we find inside a product becomes part of our detailed reviews, including pictures of the insides and of interesting details.

We do this because ruggedness isn't something that's just skin-deep. Truly rugged mobile computing devices are designed from the ground up to be tough and rugged, able to handle the various kinds of abuse they may encounter in customers' hands (and falling out of customers' hands). While the outside of a successful consumer product must look good and appeal to the eye, a rugged product must look good inside, too, and by "look good" we mean designed and built to handle abuse. For us here at RuggedPCReview, that means it's mandatory to look inside and describe what we find. Else we wouldn't do our job.

We've felt this way for a very long time, ever since we reviewed, back in the mid-1990s, a tough-looking tablet its manufacturer said was specifically designed for the military and operation under the harshest conditions. It looked very tough indeed, but when our editors took it apart, it was like a half-finished science project inside. There were wires and loose connectors everywhere, things were not fastened in place, seals were inadequate or non-existent, and the internal layout and organization did not make sense. There was no way that product was going to hold up out there in the field. Not surprisingly, that company went out of business shortly thereafter.

It was then that we decided to review what's inside a rugged device as carefully as we describe and document what's outside. We love taking pictures that show off a product out there in the muck, rain, water, snow or ice, because those are the extreme conditions rugged computing products are being designed for. But we also show what's inside. Because what's inside, the computer, is what the tough and rugged exterior must protect, and even the hardest shell cannot protect the guts of a rugged system if it's not designed and built right inside.

By and large, the guts of today's rugged products are far, far better than what we've seen in the past. We used to see plenty of seals that could not possibly seal, plenty of connectors that could not possibly stay connected, plenty of parts that were certain to break, plenty of layouts that were too complex to work, and plenty of cooling systems that could not stay cool. We saw plenty of foils, conductive materials, seals, screws and soldering that could not possibly survive even the first time the unit was taken apart for repair or maintenance. We saw plastic clips that would break, screw sockets that would fail, seals done wrong (or omitted entirely), and materials that simply made no sense.

It is better now, and perhaps our many years of documenting and discussing what's inside rugged systems, and how they are made, have contributed in a small way to that progress. And even if not, it has probably helped raise awareness among interested parties of what's inside all those important and often costly tools for tough jobs, tools that must not fail.

The vast majority of the manufacturers we have worked with over the years understand that. Most take pride in the internal quality of their products and appreciate our documentation of the insides of their products with photography that's often much better than what even the FCC does.

But every once in a while, we're told we must not open a device or must not publish pictures of what's inside. Stated justification for the former may be that a unit is sealed and opening it would destroy the seal and reduce or eliminate ingress protection. We don't consider that a good argument for two reasons. First, we can't recommend a product when we're not even allowed to look inside. And second, if seals break when the unit is taken apart, that makes service expensive, difficult and inconvenient, big negatives all.

We've also had a very few requests not to publish interior pictures because then the competition would know how it's done and steal the design. That, likewise, we do not consider a good argument. If the competition is indeed concerned enough to want to know what's inside a product, they will simply buy one and see for themselves (that happens all the time, everyone does it). But what if designs are "stolen"? Still not a good argument; one cannot easily copy an entire design from a picture. We're not talking rearranging Lego blocks here.

By and large our experiences with the industry have been overwhelmingly good. Almost everyone is helpful and genuinely concerned about making the best possible products. Project managers, in particular, take great pride in the designs they are entrusted with. Most love to share, discuss issues, answer questions, and appreciate feedback. Most marketing people we work with are also great sources of information as well as helpful conduits to/from technical staff and PMs.

Reader and site visitor feedback is uniformly in favor of detailed reviews that show both the outside and the insides of the products they are interested in. It helps them make more educated purchasing decisions.

So that is why we here at RuggedPCReview take things apart and show what it looks like inside. We could save ourselves a lot of time and effort not doing it, but then we wouldn't be doing our job. And we wouldn't do a favor to manufacturers who often learn from our third-party analysis, and we certainly wouldn't do a favor to our readers.

Posted by conradb212 at 5:18 PM

June 29, 2016

The Microsoft Surface mystery

According to numerous reports online, Microsoft will apparently stop offering the Surface 3 tablet by the end of 2016 and it's not certain if there'll ever be a Surface 4. Microsoft, of course, has had a checkered past with its forays into hardware, and many of the company's hardware partners likely have mixed feelings about the Surface tablets that are direct competition for their own products.

Yet, the Surface tablets appeared to have been quite successful. After a rocky start with the wing-clipped Windows RT tablets, sales of Surface tablets running real Windows looked very good. Back in February 2016 we reported on IDC estimates of 1.6 million Surface tablets sold in Q4 of 2015. Most of them were Surface Pro models and not the lower end Surface 3, but anytime a tablet product line sells in the millions, we'd see that as a success. For the full fiscal 2015, Surface sales amounted to US$3.6 billion, a big increase over 2014's US$2.2 billion, which already had greatly exceeded 2013's sales of under US$1 billion.

So what do we make of that? Why would Microsoft give up on the Surface 3 and perhaps not even offer a Surface 4? Most likely because selling low-end tablets is just not profitable. The predicament's clear: with "white box" tablets practically being given away, and even brand name tablets selling for very little, price is an issue in the tablet market where differentiation in anything but performance is difficult.

And performance costs money. Even mid-range Intel Core processors cost hundreds of dollars at a time when even good Android tablets can be had for a fraction of that amount. So in order to compete, Windows tablets had to resort to low-end Intel processors, mostly from the Atom and perhaps Celeron lines. Some Atom chips cost no more than 20 bucks or so in quantity, and so there's plenty of temptation to use them in lower-end Windows tablets.

Which is almost always a big mistake. The mobile computing market is littered with failed products whose designers had tried to keep costs down by using cheap Atom chips. The low, low price of netbooks had seduced tens of millions, who then quickly found out that acceptable performance just wasn't there. Same with numerous rugged tablet designs, almost all of which ended up switching to higher-end processors.

So perhaps the retreat from low-end Surface tablets is just an admission that for general purpose Windows computing, low-end processors just can't hack it. So between unhappy customers on the one side, and unhappy beancounters who don't see much profit from these low-cost Windows tablets on the other, it's a losing proposition. This does not mean that Intel's non-Core chips are generically bad choices for personal computing devices, just that there's a very delicate tipping point where the plus of a low price is outweighed by the minus of lower performance.

So perhaps that means Microsoft will be concentrating on the higher end of the tablet market, where the Surface Pro models have done well and the profit margins are higher. Add to that the Microsoft Surface Book and the rising interest in 2-in-1 hybrid tablet devices, and Microsoft's retreat from low end tablets may just be a clever shift in market focus.

2-in-1s, of course, have their own issues. While a device that can both be used as a tablet and as a conventional laptop is a compelling idea, it's surprisingly difficult to implement. The concept has been around for a quarter of a century, but with few actual products that reached more than specialized niche markets. So anyone entering that market would be well advised to examine earlier 2-in-1s, and what kept them from breaking through.

Posted by conradb212 at 2:43 PM

May 24, 2016

Household items: coding, standards, and "2x" pics

Back in the day when we published Pen Computing Magazine and Digital Camera Magazine and some other titles in print, we always prided ourselves on being in total control of our own destiny. We did virtually everything in-house — writing, editing, photography, layout, prepress, web, marketing and advertising — and most of us had mastered several of those disciplines. We didn't want to farm anything out or rely on any one expert.

We felt the same about software. We had our own webhosting, our own servers right in our building, and we programmed everything ourselves. That way, no one could force us to update or upgrade when we didn't want to, no one could quietly put more and more unrelated ads, pop-ups and clickbait onto our pages, and no one could suddenly go out of business on us. No one could control us or buy us out either, because we financed everything ourselves. One doesn't get rich that way, but it pretty much guarantees continuity and longevity.

That doesn't mean we didn't run into issues. The author of a terrific piece of software that we used and loved was of the paranoid sort and, even though we paid a hefty price for the system, insisted on compiling everything and locking the software to our particular hardware. So every time we upgraded our servers even in a minor way, we had to go and beg the man for a new code. That became increasingly more difficult, and eventually he refused altogether.

Fortunately, that was an isolated incident, which is a good thing, as we use dozens of tools and utilities that run on our own server and without which we couldn't do business. Many are orphaned or haven't been updated in many years. But they still work, and they do the job better than what replaced them.

RuggedPCReview.com is a vast site with thousands of pages. Yet we don't use a content management system or anything like it. We handcode everything. Sure, we have utilities and scripts and routines that make the job easier, but when a new page goes up, it hasn't been generated by rev. 17.22 of version 5.21 of some corporate software system. It's all coded by hand.

Don't get the idea, though, that we're hidebound and unwilling to go with the flow. We routinely evaluate whatever new tools and systems come along. A few years ago we analyzed HTML 5 and recreated part of RuggedPCReview in pure HTML 5. It was an interesting and stimulating exercise, and we adopted parts of HTML 5, but didn't see a need to convert everything.

More recently we took a look at WordPress. Like Movable Type, which we still use (and run on our own server), WordPress started as just blog software. It's now morphed into a full-fledged content management and site generation system, one that's replacing more and more conventional websites. As we had done with HTML 5, we analyzed WordPress and recreated RuggedPCReview in WordPress.

We rejected WordPress for a variety of reasons. First, we use tables everywhere, and WordPress is terrible with tables. Second, WordPress is based on modules that are pretty much black boxes. You don't know what they do or how (unless you want to dedicate your life to learning and deciphering WordPress in detail). We don't like that. Third, WordPress layout is terrible. Even the best templates look like pictures and text blocks have randomly been dropped on a vast ocean of white background. And fourth, and most egregiously, with WordPress sites you never know if an article or posting is current or three years old.

So thanks, but no thanks. Which means that when we need to implement a new feature on our site, we have to be creative. A couple of years ago one of our much appreciated sponsors was unhappy that sponsor logos were listed alphabetically, which meant that some sponsors were always on top and others at the bottom. A reasonable complaint. WordPress likely has some black box for that, or maybe not. Our solution was to find a script and modify it for our purposes. It's been working beautifully.

Technology advances at a rapid pace, of course, sometimes for the better, and sometimes in ways that make you wonder what they were thinking, because what came before worked better. That's mostly the case with software; hardware advances are generally a good thing. But here are a couple of examples of how advances in hardware affect running a site like RuggedPCReview.

There was a time when the web was on desktop and laptop monitors, and phones either didn't have anything like the web, or some separate abbreviated version of it, like the unfortunate and ill-fated WAP mobile web that was on older feature phones. But with smartphones getting ever larger displays and ever more powerful electronics, there really wasn't a need to have two separate webs. Standard web browsing works just fine on phones.

Problem is that even a 5.5-inch screen like the one on the iPhone 6 Plus is awfully small to take in a webpage. You can, of course, quickly zoom in and out thanks to the wonders of effortless capacitive multi-touch, but that, apparently, was a thorn in the side of interface developers. So we're seeing all those efforts to make sites "mobile-friendly." The currently prevailing school of thought is to have sites consist of blocks that arrange themselves automatically depending on the size and width of a display. So if you have three pictures next to one another on a standard desktop browser, on a smaller screen the three pictures will rearrange themselves and become stacked on top of one another. Same with text blocks and other site elements.

That may seem like a brilliant solution to programmers, but it's a hideous aesthetic nightmare in the eyes of anyone who's ever done layout and crafted pages just so. The mere idea that this could be a solution seems preposterous. So we're staying away from that nonsense.

But there are other issues. One of them is resolution. There was a time when most desktop and notebook displays used the same resolution, with every few years bringing a new standard that would then slowly be adopted. 640 x 480 VGA was gradually replaced by 800 x 600 SVGA which, in turn, was slowly replaced by 1024 x 768 XGA. Handhelds were in their own category with proprietary screen resolutions (like Palm) or the 240 x 320 QVGA of Pocket PCs.

That first changed when "wide-format" displays became popular. Where once everything had been displayed in the same 4:3 aspect ratio as TV sets, various aspect ratios quickly became an additional variable. The tens of millions who bought early netbooks will remember how the 1024 x 600 format favored on netbooks awkwardly cut off the bottom of numerous apps that were all formatted for 1024 x 768. And so on.

Then something else weird happened. While desktop and notebook displays only very slowly adopted higher screen resolutions, resolution virtually exploded on smartphones and tablets. Apple introduced the concept of "retina" displays where, when looked at from the typical viewing distance for a class of device, individual pixels can no longer be detected by the naked eye. As a result, high resolution quickly became a strategic battleground for smartphones. That led to the interesting situation where many smartphones with small 5-inch screens had the same 1920 x 1080 resolution as 15-inch notebooks, 27-inch desktop monitors, and 65-inch HD TVs. And now there are smartphones with 4k displays. That's 3840 x 2160 pixels, the same as 80-inch ultra-HD TVs.

What that means is that ordinary websites must now display in at least reasonable shape and quality on a device spectrum that ranges from tiny displays with insanely high resolution all the way to much larger desktop and laptop displays with generally much lower resolution, and often still using the old legacy resolutions.

Which is especially bad for pictures. Why? Well, let's take a look at RuggedPCReview. Ever since we started the site in 2005, all pages have been 900 pixels wide. That's because we wanted to comfortably fit into the 1024 pixel width of an XGA display. What happens when you look at the 900-pixel site on a full HD screen with its 1920 pixel width? Well, boxes and text and such scale nicely, but pictures suddenly look much worse. Now go to a 3840 pixel wide 4k screen, and the pictures are hardly viewable anymore.

So what does one do?

Turns out this is an issue that's been hotly debated for several years now, but a common solution hasn't been found. I did some research into it, and there are literally dozens of ways to make pictures look good on various sizes of displays with various resolutions. They use different technologies, different coding, and different standards, which means most may or may not work on any given rev of any given browser.

In general, how can the issue be handled? Well, you could have high-res pictures, have the browser download those, and then display them at lower resolution if need be. Or you could use one of the image formats where the picture starts off blurry and then sharpens. If all the sharpness isn't needed, the browser could simply stop the process. Or you could download various versions of the same picture and then display the one that makes the most sense for a given resolution. One thorny issue is that you don't want to download a high-res picture when all you need is a much lower-res version. That'd be bad for bandwidth and loading speed. You also don't want to first load the webpage and then have it sit there with none of the pictures loaded while the software decides which version of a picture it should load, or while it's milling to get the picture into the optimal resolution. It's a surprisingly difficult issue.

After having read a good many articles on the issue, I was about to give up because all approaches seemed too complex to make sense for us to pursue.

But then I came across a solution that made sense. It's the "srcset" attribute that can be used with the standard HTML code for displaying an image. The way this works is that you tell the browser to display a picture the way it always does. But the srcset attribute now also says that if the screen of the device the picture is to be viewed on has such and such resolution, or is so and so many pixels wide, then use the higher resolution version of the picture! That sounds a bit difficult, but it's made easier by the fact that modern browsers know whether they run on a "1x" screen (i.e. a good old-fashioned standard-resolution display), a "2x" screen (like, for example, the 27-inch Retina iMac), or even a "3x" display (like a smartphone with insane resolution). Which means the browser only has to download one image, and it'll be the one that looks best on that particular display. Yeah!
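As a sketch of what that looks like in markup (the filenames here are hypothetical), srcset simply lists alternate versions of the same image together with their intended pixel densities:

```html
<!-- Hypothetical filenames: photo.jpg is the standard "1x" version,
     photo@2x.jpg a double-resolution version of the same picture. -->
<!-- Browsers that don't understand srcset ignore it and fall back
     to the plain src attribute. -->
<img src="photo.jpg"
     srcset="photo.jpg 1x, photo@2x.jpg 2x"
     width="450" height="300" alt="Rugged tablet product shot">
```

The browser picks one candidate based on the display's device-pixel ratio and downloads only that file.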

There's one problem, though. And it's one that can be frustratingly difficult to solve. It has to do with picture size. Anyone who is familiar with modern compact or DSLR cameras knows that there is a huge difference in the size of a "raw" image and one that's saved in JPEG format. And also that pictures can be saved at various quality levels in the JPEG format. For most web display situations, the art of the deal is to compress as much as you can while still having a decent looking picture.

How does that affect having various versions of the same picture for different types of displays? Well, if a standard picture takes 50kb of storage space, a "2x" picture will take four times as much, 200kb. And a "4x" picture would weigh in at 16 times as much, or a hefty 800kb. It's easy to see that this can result in serious bandwidth munching and sluggish page loading.
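The math above can be sketched in a few lines. This assumes, as a rough rule of thumb, that at a constant JPEG quality file size scales with pixel count, i.e. with the square of the scale factor; real files vary with content:

```python
# Rough bandwidth math: at a constant JPEG quality, file size scales
# roughly with pixel count, i.e. with the square of the linear scale factor.
def scaled_size_kb(base_kb, scale):
    """Estimated size of a picture saved at `scale` times the linear size."""
    return base_kb * scale ** 2

base = 50  # a "1x" picture weighing 50kb
for s in (1, 2, 4):
    print(f"{s}x version: ~{scaled_size_kb(base, s)}kb")  # 50, 200, 800
```

Which is exactly why simply shipping everyone the biggest picture is a non-starter.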

Turns out, the human eye is very easily fooled, and so we can cut some corners, but it has to be the right corners. Trial and error revealed that with our RuggedPCReview site, saving a "2x" size JPEG in what Photoshop considers "low" quality at a "10" level takes roughly the same amount of storage as saving the same picture in a smaller "1x" size at "good" quality, which is Photoshop's "60" level. But doesn't the picture look terrible at such high compression? Amazingly, no. It looks great, and much sharper than the higher-quality low-res picture. That means that while we still must create two versions of each picture, loading a page with a high-res picture on a high-res display takes no longer than loading the low-res picture!

That sounded too good to be true, but we tried it and it works. So from now on, whenever possible, new RuggedPCReview pages will include both low-res and high-res versions of their images.

Isn't technology great!?


Posted by conradb212 at 6:26 PM

February 23, 2016

Cat S60 — More than the naked eye can see

They used to say, and likely still do, that a picture is worth a thousand words. That's certainly true, but it can also be quite misleading as pictures often tell a story rather than the story. There can be quite a difference between these two. The media is very adept at using carefully chosen pictures that tell a story that may or may not be so, or present the story with a slant or an agenda. One could almost say that a picture can tell a story in a thousand different ways. And in the age of Photoshop (used generically here; any image program will do), that's more true than ever. And let's not even talk about videos.

There is, however, another aspect of pictures. Most only tell what the human eye can see. Light is electromagnetic radiation, and only a small part of it is visible to the human eye as colors. It's the part with wavelengths of roughly 380 to 750 nanometers. Below that are ultraviolet light, then X-rays and other rays. Above it, first infrared, then microwaves and radio waves.

I've always been interested in the spectrum beyond what we can see.

My initial degree was in architecture, and that meant understanding the principles of heating and cooling as well as energy conservation. While we humans primarily feel temperature, temperature can also be seen. Technologies that make infrared wavelengths visible to the human eye can show us temperatures as colors.

As an enthusiastic scuba diver I learned the ways light behaves underwater. Colors are different underwater because waves travel differently, and some wavelengths are filtered out by water sooner than others. Lower energy waves are absorbed first, so red disappears when a diver reaches a depth of about 20 feet. Orange disappears next, at around 50 feet. Then yellow at about 100. Green stays longer and blue the longest, which is why things look bluer the deeper you go. But it's not always like that. For example, I found that sometimes I could see red at depths where it was not supposed to be visible. I wrote about that in Red at Depth.

The image below shows the same coral head at a depth of about 90 feet without artificial light on the left, and with the flash on the right. Without the flash, the red on the left ought not to be visible at all. And yet it is.

Then there's the interesting phenomenon of fluorescence. Fluorescence essentially describes the physical process of a substance absorbing light at the incoming wavelength, and re-emitting it at a different wavelength. Almost everyone has seen the ghostly effects of "black light" that makes some materials glow brightly while leaving others unaffected. Through scuba I found that there's a totally fascinating world of fluorescence under the sea. I described that in Night Dives Like you've Never Experienced Before.

The image below shows an anemone we photographed with a yellow filter and a NightSea fluorescent protein flashlight. In normal light you'd barely see the animal, but it is strongly fluorescent and lights up under certain incoming wavelengths.

Having founded Digital Camera Magazine in 1998 gave me an opportunity to witness the progression of digital imaging, and also the opportunity for hands-on experience with a large number of different cameras and lenses. That experience, not only with cameras but also with the underlying imaging technologies, led to an interest in emerging uses and applications. That included explorations in ultra-slow-motion imaging for science projects with my son, and examining the emerging action photography and videography market (it's safe to say that I helped GoPro understand how light behaves underwater; see The GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do).

Below you can see me testing different color filters in a Northern Florida spring with a specially built rig with two GoPro Hero 3 cameras.

In my work as publisher of RuggedPCReview and before that Editor-in-Chief of Pen Computing Magazine, I came to appreciate thermal modeling and design in microelectronics where proper cooling and removal of heat generated by the processor and related circuitry is among the most important aspects of any mobile computing design.

That's where infrared imaging comes into play. Human eyes and normal cameras cannot see infrared. Older generations will remember that infrared, commonly referred to as IR, was widely used for data communication between computers and devices, and it is still used in many remote controls. Older audiences will also remember the "Predator" movies where those merry human-hunting aliens saw the world in infrared. Infrared, or thermographic, cameras have been available for decades, but at a high price.

Recently, a company named FLIR Systems changed all that with a series of much lower-priced thermographic cameras as well as the FLIR One thermal imaging camera module that snaps onto an iPhone or Android device. Not having a review relationship with FLIR, I pre-ordered a FLIR One for my iPhone 6 Plus, and it arrived in late 2015.

The FLIR One is an absolutely revolutionary product in that it lowers the cost of very functional thermal imaging to around US$250. The way the FLIR One works is that it shoots both a thermal image and a conventional picture. Why the conventional picture? Because thermal imagery doesn't provide much physical detail and it can be difficult for our eyes, unaccustomed to thermal data, to interpret the image. So what the FLIR One does is extract line-drawing type of detail from the conventional image and then merges it with the thermal image. That makes it much easier to see exactly what the thermal data pertains to. The FLIR One does all of that automatically.
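The basic idea of that blending can be sketched in a few lines. This is a toy illustration, not FLIR's actual algorithm: tiny 2D lists stand in for real images, a crude neighbor-difference pass stands in for real edge extraction, and the edges are simply added on top of the thermal readings:

```python
# Toy sketch of the edge-overlay idea: pull outline detail from the
# visible-light image and lay it over the thermal image so the thermal
# data is easier to interpret. Not FLIR's real processing.

def edge_magnitude(gray):
    """Crude edge detector: absolute difference with right/bottom neighbors."""
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            edges[y][x] = abs(gray[y][x] - gray[y][x + 1]) + \
                          abs(gray[y][x] - gray[y + 1][x])
    return edges

def blend(thermal, edges, weight=0.5):
    """Overlay the edge detail onto the thermal reading at each pixel."""
    return [[t + weight * e for t, e in zip(trow, erow)]
            for trow, erow in zip(thermal, edges)]

# Visible image: a bright square on a dark background -> edges at its border.
visible = [[255 if 1 <= y <= 2 and 1 <= x <= 2 else 0 for x in range(4)]
           for y in range(4)]
thermal = [[20.0] * 4 for _ in range(4)]  # a uniform 20-degree scene
out = blend(thermal, edge_magnitude(visible))
```

In the blended output, pixels along the square's outline stand out against the flat thermal background, which is exactly the effect that makes the FLIR One's images so readable.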

When you show an IR picture to an uninitiated person, they will almost always assume it's just a Photoshop filter applied to a regular picture. But that's definitely not so. The thermal camera records data readings and then displays those on a color scale. FLIR One users can select from various color schemes. Since most people associate blue with cold and red with hot, I usually use the blue-red scheme.
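The mapping from readings to colors is, at its simplest, a linear interpolation. Here's a minimal sketch of the blue-to-red scheme described above; the actual FLIR palettes are more elaborate, and the numbers are illustrative:

```python
# Map a temperature reading onto a blue (cold) to red (hot) color scale.
def temp_to_rgb(temp, t_min, t_max):
    """Linearly interpolate between pure blue (coldest) and pure red (hottest)."""
    span = (t_max - t_min) or 1.0
    f = max(0.0, min(1.0, (temp - t_min) / span))  # clamp to [0, 1]
    return (round(255 * f), 0, round(255 * (1 - f)))  # (R, G, B)

print(temp_to_rgb(0, 0, 40))   # coldest pixel: pure blue
print(temp_to_rgb(40, 0, 40))  # hottest pixel: pure red
print(temp_to_rgb(20, 0, 40))  # midpoint: a purple mix
```

Every pixel of the thermal frame gets run through a lookup like this, which is why switching color schemes is instantaneous: the underlying data never changes, only the palette.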

What can you use the FLIR One for? Well, the applications are almost endless. The architect in me began a thermal imaging review of my home, identifying the efficiency of insulation and the presence of leaks. The scuba diver in me donned full scuba gear and examined hot and cold spots as those can definitely be an issue on a deep cold-water dive. And the reviewer of advanced rugged mobile computing products in me, of course, instantly began examining the thermal properties of gear in our RuggedPCReview testing lab. It's fascinating to see the heat signature of a mobile computing device, how it changes as the device heats up, and get an overall idea of how well the designers and engineers handled the thermal aspects of their products.

Below are some examples of images taken with the FLIR One iPhone module. The image on the left shows a heater floating in an iced-over koi pond. Dark blue is the ice on the surface of the pond, orange and yellow the warmer water kept ice-free by the heater. The image on the right shows the thermal design of a RuggON PX-501 tablet (see our full review). Yellow shows the heat-generating CPU and ancillary circuitry, as well as the copper heat conduit to the fan.

The two pictures below help architects, home owners and contractors. On the left, it's obvious which part of an attic room needs better insulation. On the right, you can literally see the cold air wafting in between two windows on an icy December day.

Is the FLIR One perfect? Not yet. While quadrupling thermal resolution over earlier low-cost efforts, it's still only 160 x 120, far less than the typical 8-16 megapixels recorded in the visible light spectrum. You don't need nearly as much resolution to convey useful thermal imagery (640 x 480 is considered high-res), and so the current low res is not a big problem. And now that FLIR has gained a foothold with the FLIR One and similar offerings, we'll likely see higher resolutions very soon.

But my story doesn't end here. In fact, you could say everything above is just an introduction to the news I had wanted to write about, the Cat S60.

Normally, we consider ourselves pretty well informed about anything in the rugged computing industry, but the Cat S60, officially introduced February 18, caught us by surprise. What is the Cat S60? Cat? Yes, Cat as in Caterpillar! Caterpillar actually sells a line of rugged smartphones on catphones.com. They all look suitably rugged, they all sport the familiar Cat logo, and their design language is reminiscent of Caterpillar's earthmoving machinery.

What immediately came to mind were branded signature editions of rugged hardware, like the "Hummer" version of one of the rugged Itronix notebooks several years ago. So did Caterpillar actually also make smartphones? Not really. Turns out that the Caterpillar phones come courtesy of UK-based Bullitt Group, a privately held technology company that apparently works with a number of tech companies. From what I can tell they either license famous brand names or work in joint ventures. As a result there are JCB phones (JCB is a British heavy equipment manufacturer) and Kodak phones, and Bullitt has even licensed the Marconi brand from Ericsson to launch a range of radios named after the very inventor of radio. Bullitt does about $100 million in sales a year, not tremendous, but very respectable.

What's really interesting, though, is that the Cat S60 is not just a rugged smartphone built to benefit from the Caterpillar image and name. No, it's actually what Bullitt calls "the world's first thermal imaging smartphone" and it has a built-in FLIR imaging camera. So you get thermal imaging built right into your rugged smartphone. The Snapdragon 617 octa-core powered Android Marshmallow phone itself has a bright 540 nits 4.7-inch procap display that can handle wetness and gloves. Ruggedness specs are quite impressive with a 6-foot drop, and what appears to be IP68 sealing. The Cat S60 is said to be waterproof down to 17 feet for an hour, and its 13mp documentation camera can supposedly be used underwater (see Catphones media release).

Wow. That is impressive indeed. Having a rugged smartphone with integrated thermal imaging capability opens up entirely new applications and functionality in any number of areas. How is this possible? With FLIR's tiny Lepton longwave infrared sensor that's smaller than a dime. For those interested in all the details, here's the full FLIR Lepton datasheet in PDF format. Resolution of the initial Lepton imager is limited to 80 x 60 pixels, the same as in FLIR's iPhone 5 camera module, and entirely adequate for thermal imaging on a small smartphone screen. How much does the Cat S60 cost? US$599. Which seems almost too good to be true.

This is all very exciting. I don't know what Caterpillar's reach is in phones, given that we had actually never heard of their phone operation. Then again, yesterday I had a meeting with my landscape architect, and the man had not only heard of, but was very interested in, the new Cat S60. I can absolutely see how offering rugged handhelds or tablets with an integrated FLIR Lepton camera can be a strategic advantage, especially if bundled with thermal imaging demos and apps. And having that kind of functionality in a product would not only be of great interest to many customers, but also definitely gold for marketing.

Posted by conradb212 at 3:28 PM

February 15, 2016

Keeping an eye on the level of technology offered in consumer tech: Dell Venue 8

The consumer market is really, really tough. Sure, massive fortunes can be made off it thanks to the sheer size of it, and thus the potential of millions of units sold. But few products ever make it into that sales stratosphere, and the competition is brutal. Make one mistake, be it in technology, manufacturing, marketing or just about anywhere else, and the product tanks, expensively. Add to that the fickle taste of consumers, the unpredictability of trends, a lightning-quick product cycle pace, and the true successes are few and far between. That leaves some very good and often excellent products behind. Or at least underappreciated.

That all came to mind as I spent the weekend playing with my latest impulse buy, a 7000 Series Dell Venue 8 tablet. First available in early 2015, the Venue 8 is an 8.4-inch consumer tablet that's part of Dell's efforts to establish itself in mobile technology. That effort had seen the short-lived Dell Venue smartphones and then a re-introduction of the line in late 2013 as tablets. Venue tablets are available in both Android and Microsoft Windows versions, the latter as the Venue Pro.

So why did I get a Venue tablet? In part because I've always liked Dell. I liked the old Dell Axim handhelds of the Pocket PC era, I like the various rugged Dell tablets and notebooks we've tested here at RuggedPCReview.com, and I liked Dell's decision to take the company private so as not to be at the mercy of Wall Street analysts whose quarterly growth expectations must be met lest they "worry" or, worse, become "concerned."

In this instance, I received an email alerting me to a special deal on the Venue 8. I decided to check it out, as the trusted Google Nexus 7 I'd been using as my personal small Android tablet, and also as a point of reference whenever we test a rugged Android device, had outlived its usefulness. After the latest OS upgrade — which to Google's credit was always quickly available for the Nexus 7 — the device had become so sluggish as to be useless. Could I not just ask one of our sponsors for a long-term loaner? I could, but I always like to have something that's truly mine and not subject to a sudden unexpected recall for inventory purposes or such.

Add to that that the deal was very sweet. Just US$199 for a 16GB Dell Venue 8 7840 running Android 5.1 (Lollipop). That's $200 off the regular US$399, and shipping included. Fast shipping, too, as the package arrived at my doorstep just a day and a half after I ordered the tablet.

Now, since I am writing this for the RuggedPCReview.com blog, let me make something clear right off the bat: the Venue 8 is NOT a rugged device. It's of the standard consumer/business variety that Dell has always specialized in. So why the write-up if it's not a rugged device? Because it's always good to see what consumer technology is up to and what consumers expect, and get, for their money. With the massive global reach of smartphones and tablets, what consumers expect from their personal gear has a direct impact on what they expect from rugged gear. So there.

If that's the case, and according to our experience it is, then every manufacturer of rugged mobile computing gear should get a Venue 8 and study it. Because consumers get an awful lot of very advanced technology with this tablet, even at its US$399 list price.

First, there is the 8.4-inch display with massive 2,560 x 1,600 pixel resolution. That's 359 pixels per inch and incredibly sharp. It's an OLED (organic light emitting diode) screen that's also vivid with deep blacks and intense colors. The display has absolutely perfect viewing angles from any direction. Brightness ranges, depending on what review you want to believe, from 250 to 430 nits. I consider it very bright. The folks at AnandTech actually didn't like the display very much in their review of the Venue 8 (see here). But compared to most displays in rugged devices, it's very, very good.

For a processor, the Venue 8 has a quad-core Intel Atom Z3580 with a maximum burst frequency of 2.33GHz. The Z3580 is part of the "Moorefield" lineup of Atom chips and designed specifically for smartphones and tablets. It's different from Bay Trail chips in that it's not using Intel HD Graphics, but a PowerVR Rogue G6430. There's 2GB of RAM and 16GB of eMMC mass storage, plus up to 64GB via externally accessible micro SD card. There's Bluetooth 4.0 and fast 802.11ac WiFi via an Intel 7260 WiFi + BT 4.0 module. And the 21 watt-hour battery is supposed to last 9.5 hours.

The Venue 8 has not just two, but four cameras. Yes, four. There is the standard frontal conferencing cam (2mp), there is the 8mp rear-facing documentation camera, and then there are two supplementary 1mp cameras that work in conjunction with the 8mp camera to allow depth measurement as well as adjusting the focus of a picture after it's been taken. You don't see that anywhere else.


The whole thing is packaged into an anodized aluminum case that measures 8.5 x 4.9 inches and is less than a quarter of an inch thick. This tablet makes the sleek iPhone 6 look a bit stout. Weight is 10.75 ounces.

So how well does the Venue 8 work?

Very well. The Venue 8 has an exceptionally high-quality feel to it, in that Apple-esque way that makes it feel like the device is milled from a solid block of metal. And the OLED display is just gorgeous with its rich, vibrant colors and deep blacks. It's like looking at a particularly good plasma TV compared to a regular LCD TV. I/O is minimal. There's the tiny micro-USB jack for charging. There's the micro SD card caddy. A headphone jack. And then the small volume rocker and power switch.

What's a bit unusual is the asymmetrical layout with 3/16th of an inch bezels on three sides and then a much heftier 1-3/16 inches on the fourth. That's good news for the speakers, which get much more real estate than they usually do in a small tablet, and they face forward for good sound. That makes for a largish area by which to hold the tablet, but both the front and the rear main cameras are also located there, which means it's easy to inadvertently cover them. Overall, I like the arrangement.

In terms of performance, the Venue 8 is quick. Overall, few of the Android devices that I've tested or worked with are as smooth as anything that comes out of Apple; a degree of stuttering here and there is common. There's very little of that on the Venue 8. It's generally quick and responsive, and a pleasure to use.

Literally billions are familiar with Android now, which means that whatever quirks Android has — and it still has its fair share of them — do not affect its popularity. The vast variety of Android hardware and the numerous versions of Android itself, however, still mean that some apps are either not available for a particular device or version, or they are not optimized for your particular device.

Some reviewers have complained about very small text and icons on the Venue 8 due to its very high resolution on a fairly small display. I did not find this to be an issue under Android Lollipop, and certainly much less of an issue than it is on most small Windows tablets.

The "depth" camera assembly with its main 8mp camera flanked by two 1mp complementary cameras has me baffled. The idea here is that the two subsidiary cameras allow the tablet to capture depth information that can then be used to do a variety of things to a picture, like focusing on certain areas, applying filters to certain areas, or even measuring distances and areas. It can also be used to widen or narrow the depth of field for artistic purposes.
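The principle behind any such two-camera setup is classic stereo triangulation: an object shifts between the two views, and that shift (the "disparity") shrinks as distance grows. Dell doesn't publish its calibration details, so the numbers below are purely illustrative:

```python
# Standard stereo-triangulation relationship: depth is proportional to
# focal length and camera spacing, and inversely proportional to disparity.
# Illustrative numbers only; not Dell's actual (undocumented) calibration.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in meters from focal length (pixels), camera spacing (meters),
    and the pixel shift of the object between the two views."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object too far to measure")
    return focal_px * baseline_m / disparity_px

# With a hypothetical 1000-pixel focal length and cameras 2 cm apart:
print(depth_from_disparity(1000, 0.02, 20))  # object about 1 meter away
print(depth_from_disparity(1000, 0.02, 40))  # bigger shift -> closer object
```

That per-pixel depth map is what lets software refocus a picture after the fact or estimate real-world distances, at least in principle.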

Unfortunately, this is only marginally documented, and didn't work all that well for me. On top of that, the 8mp camera itself isn't nearly as good as most people have come to expect from their smartphone cameras. So that's a disappointment. It does, however, still work better than most cameras in rugged systems. I understand the need to differentiate a product from the competition, but in this instance I'd have preferred one really excellent documentation camera instead of the triple camera experiment.

Battery life is excellent, especially considering there's only 21 watt-hours, and the device is less than a quarter inch thick. Battery life is so good that, like in iPads, it simply ceases to be an issue.

So what does it all mean as far as rugged computing is concerned? Since it's not a rugged device, nothing directly. However, the Venue 8 demonstrates the level of technology and features that's available to consumers for very little money. And the impressively high quality. The vibrant OLED display with its very high resolution. All for US$399 list, or the ridiculously low US$199 on special.

And what does it mean as far as manufacturers of rugged tablets are concerned? Simply that the consumer market is spoiling consumers with advanced technology that's hard to match in low-volume ruggedized gear with much longer product cycles. So I don't expect to find all this enticing high technology in rugged computing products for the job. But it definitely IS good to keep an eye on what consumers are getting for very little money. Because those consumers then want the same in their professional gear.

Posted by conradb212 at 7:55 PM

January 19, 2016

Follow-up on iPad Pro and Apple Pencil

I've now had the iPad Pro for a good couple of months and the Apple Pencil for a month and a half. How do I use them? Have they changed my life?

As far as the iPad Pro goes, it has totally replaced my iPad Air 2. I don't think I've used the Air 2 once since I got the Pro. However, I am doing the exact same things on the Pro that I used to do on the smaller Air 2. The split screen functionality is not good or compelling enough to really work with two apps at once, and it's nowhere near universally supported.

So I use the Pro just as a larger iPad. Although the Pro is significantly heavier than the Air 2, and almost a bit unwieldy, the bigger screen and the fact that it's a good bit quicker than the Air 2 are apparently enough for me to stick with the larger Pro.

I'm disappointed that there really are no apps that are truly "pro" in the sense that they add undeniable value to a larger device, make it a professional tool instead of just the device that the iPad always has been. For now, there really is no difference.

How about the Apple Pencil? As someone who has worked with and written about pen computing technology for over 20 years, I should be a primary candidate for the Apple Pencil. I should be thrilled that Apple is finally recognizing the pen as the important productivity tool it can be.

But I am not.

I played around a bit with the Apple Pencil when I first got it, but haven't used it since. That's not because I am no longer a fan of pens. It's because Apple just didn't get it right. The Apple Pencil is too large, too slippery, and too poorly supported. You never know if an app will really support it or just part of what the Pencil can do.

And having a pen with a battery is just unappealing, especially when its primary charging mechanism, to stick the pen into the Pro's Lightning connector, is just too bizarre. As is, when I look at the Pencil and feel like I want to try it again, the first thing that comes into my mind is that it probably needs charging first. And I move on.

Add to that the fact that there's no garage for the pen on the big Pro, and the $99 Pencil seems almost like an effort by Apple to emphasize Steve Jobs' point: we don't need a pen!

All this baffles me. I really wanted to like the Pencil. But the way Apple went about it is like Microsoft went about the Kinect. An expensive add-on that shows flashes of brilliance, but overall just doesn't work well enough for people to want it.

Posted by conradb212 at 10:04 PM

December 6, 2015

An assessment of the Apple Pencil

A few weeks after the Apple iPad Pro began shipping, the Apple Pencil is now available also. This is big news because it was Apple who finally made the tablet form factor a success, and they did it without a pen. Which is remarkable, as tablet computers initially were conceived specifically as a modern equivalent of a standard notepad that you wrote on with a pen. And remarkable again as Steve Jobs was adamantly opposed to pens and often listed his reasons why he felt that way.

But now the Apple Pencil is here, a good 5-1/2 years after the iPad was first introduced. Why did Apple do it? Perhaps it's because Samsung has been selling tens of millions of their Note tablets with pens. Perhaps it's because being able to do quick illustrations or annotate text documents with handwritten notes simply makes too much sense to ignore. Perhaps it's a tacit acknowledgment that fingerpainting is limiting when it comes to anything but the simplest of artistic expression. Perhaps it's because the big iPad Pro simply needed something to differentiate itself from smaller and lesser iPads. Or perhaps it's all of the above, or something else entirely. Fact is, the Apple Pencil is here.

The big question now is how, and how well, the Apple Pencil works, and what it might mean for Apple. After all, the pen is what Jobs felt was so unnecessary.

A brief history of using pens with tablets

But before I go into my own opinions and experiences with the Apple pen, I want to outline the big picture. As stated above, tablets were initially conceived as a modern day replacement of pen and paper. And they've been around for over a quarter of a century. Contrary to what a lot of people believe, Microsoft did not invent the tablet in the early 2000s. Already back in 1989, the Momenta tablet was available, and it sparked great excitement over a future where tablet computers you could write on with a pen would replace the conventional PC.

In the early 1990s, every major computer company offered a tablet. At a company named GRiD, Jeff Hawkins (who would later invent the Palm Pilot) designed the GRiDPAD and the PalmPad. NCR had the NotePad, Samsung the PenMaster (see below), Toshiba the DynaPad, IBM the ThinkPad (1993 IBM ThinkPad 730t shown on the right), and there were many more, all tablets operated with a pen. Many of those tablets initially ran the novel PenPoint operating system specifically designed for tablets and use with a pen.

Unfortunately, while there was tremendous hype around those early 1990s tablets, they failed to become a commercial success for a variety of reasons. One was that the technology just wasn't there yet to create a tablet light and functional enough to be of much use. More importantly, supporters of the tablet concept back then took the idea of electronic pen and notepad too literally. Their central thought was that everyone knows how to use a pencil, and everyone knows how to write. So we simply use computing power to translate handwriting into text. The tapping, panning, pinching and zooming of modern tablets simply never entered the equation because a) it wasn't possible back then, and b) people didn't do that on pen and paper pads, so why do it on a computer?

As a result, early 90s tablets completely relied on pens. If the pen failed to work or was lost, there was no touch functionality as a backup. The tablet became useless. Likewise, if the computer failed to understand the handwriting or gestures written on it, which was often the case, it was useless.

That quickly dampened the enthusiasm for tablets, and the fact that Microsoft fought against the new pen-centric operating systems tooth and nail didn't help either. It was a war Microsoft won. Its token-effort "Windows for Pen Computing" prevailed against the far more innovative PenPoint OS.


That did not mean Microsoft had no interest in pens. After the Apple Newton failed in the mid-90s due to its exclusive reliance on handwriting recognition, Microsoft's own Windows CE achieved modest success on small handhelds that primarily used the pen as a mouse replacement. Microsoft then followed up with its 2001/2002 Tablet PC initiative (NEC LitePad shown on the right) that used active pens with significant built-in support from the Windows XP Tablet PC Edition. Handwriting recognition and gestures were available, but hardly played a role at all.

The Microsoft-spec Tablet PC failed because, again, it used the pen simply as a comparatively clumsy mouse replacement on Windows, an OS completely designed for use with a mouse. Plus, it was all too easy to lose the (still expensive) pen, and there was no intuitive finger tapping, panning, pinching and zooming as a backup and complement for the pen. Small wonder that Microsoft itself switched its emphasis to convertible notebooks instead of pure tablets before the "Tablet PC" was even officially launched.

Apologies for the long history here, but it's necessary to understand all this when assessing the Apple Pencil. Has Apple learned from history, or will the Pencil fail because it's making the same mistakes all over?

Has Apple learned from pen history?

So given all that, given that Apple showed the remarkably prescient "Knowledge Navigator" almost 30 years ago, given that Apple had the Newton over 20 years ago, and given that Apple has all the engineering and human interface design expertise in the world, how did Apple implement the new Apple Pencil for the iPad Pro? And what kind of technology did they use to make it work?

The first thing you notice about the Apple Pencil is that it's very long. A full seven inches. Most writing implements are about five inches long although, in fairness, an actual lead pencil is also about seven inches long. Still, it's not clear to me why Apple made the Pencil that long.

Most likely the space is needed for the electronics inside. Apple is very good at miniaturizing everything, but the Apple Pencil is a remarkably complex piece of equipment. The folks at iFixit tore one down (see here) and found not only a fairly long rechargeable (but non-replaceable) 0.33 watt-hour Li-Ion battery, but also a full logic board with an ARM Cortex-M3 processor, a Bluetooth chip, a variety of sensors and more. There's an awful lot of tech in there, and so for now Apple perhaps simply needed that much space.

Yes, there's a battery. Which means the Apple Pencil must be charged. This is in complete contrast to the active pen technology that has ruled supreme since the early 1990s: Wacom. Slender, lightweight Wacom active pens have been around since the very first pen tablets, and they've beaten all competing technologies over the past 20+ years. Microsoft pretty much built its 2002 Tablet PC around the Wacom pen. The image below shows the Apple Pencil and the Wacom pen used in a 2003 Toshiba convertible tablet.

Wacom's success was due, for the most part, to the fact that the Wacom pen does not need a battery. Every other active pen technology does; the N-Trig pens used with Microsoft's Surface tablets, for example, need a battery. Since the battery in the Apple Pencil is non-replaceable, how is it charged? Amazingly and almost unbelievably, this way:

Can anyone say "recipe for disaster"? This has got to be one of Apple's worst ideas ever. It's in the "What were they THINKING?" category. The combination of a fragile Lightning connector and a seven-inch lever sticking out from it is virtually guaranteed to eventually result in damage.

Fortunately, the Apple Pencil comes with a tiny little (and thus eminently losable) adapter so you can also charge it via a standard Lightning-to-USB cable.

Now how about two of the other big problems with pens in the past: cost, and losing the pen? The news is not good here. At US$99, the Apple Pencil costs more than a low-cost tablet, and it's certainly not a throw-away item. It's also slippery, doesn't have a clip to attach to a coat pocket (for which it'd be too long anyway), and thanks to the super-slender design of the iPad Pro, there's no garage in the tablet, or any other way to attach the pen to it. Apple should have come up with some sort of solution for that.

Apple Pencil technology

Now what about the technology? Wacom technology works with a powered digitizer grid behind the LCD that can sense the proximity and location of the pen. Resistive pens need actual touch. Capacitive pens in essence emulate a fingertip, which changes the capacitance between two electrodes. Experts I have talked to long thought that a combination of Wacom (or another active pen technology) and capacitive touch would be the ultimate answer for adding fine-detail input to capacitive multi-touch. But some have recently changed their opinion and now see advanced capacitive pens as the likely winner. An example of the latter technology is the superb capacitive pens that come with some Panasonic tablets.

Apple, however, apparently took a different approach. The Apple Pencil communicates with the tablet via Bluetooth. Which means Bluetooth must be on, with all that that entails. How exactly the Pencil works I don't know yet. I've seen sources that say the Pencil's technology is such that the iPad knows whether a touch is from the Pencil or from a finger, therefore making it possible to process Pencil touch in very different ways. Add to that the sensors present in the Pencil, and things like the much heralded "shading" become possible when the Pencil is held at an angle.

Clearly, where this much tech is involved, amazing things can be done, and the Apple Pencil is headed that way. But amazing things generally need a lot of support, and that support will take a while to materialize. As is, the Apple Pencil can be used to tap and pan anywhere, and it works in virtually all applications that accept inking. But the sexy shading requires special support, and for now it's a trial and error process figuring out which apps and tools support what.

If you search for "Apple Pencil Support" in the app store, nothing comes up. If you look for "Apple Pencil apps," a large number show up, but it's not clear how and to what degree the Pencil is supported.

How well does the Apple Pencil work?

How well does the Apple Pencil work in general?

Quite well. Playing with various drawing and sketching apps and exploring the tools with the Apple Pencil is a pleasure. Ink goes on smoothly and elegantly, and it's easy to see what the Pencil can do in the hands of an artist who invests the time to learn all aspects of all the tools. We're certain to see some amazing artwork created with the Apple Pencil.

However, pretty much anything I managed to do with the Apple Pencil I've been able to do for many years with a 20-year-old Wacom pen. Using a Wacom pen on a Microsoft-style Tablet PC delivers wonderfully smooth ink and virtually no lag, same as the Apple Pencil. Those older pens, too, have erasers and pressure sensitivity, and they can do amazing tricks. So for now I cannot exactly view the Apple Pencil as a stunning leap forward.

The Apple Pencil hasn't resolved some old pen technology issues. Even Apple claims only "virtually" no lag in its own promos, as there still is a bit of lag. The human mind has a very firm expectation of real-time versus lagging response (think of how jarring even a slight lip-sync delay is in video and movies), and with rapid movement of the Pencil, there's definitely a sense of delay.

Worse, for me, is the on-off delay in very small movements, like when an artist wants to add tiny touches here and there. In the apps I tried that with, there often was a slight delay before the touch registered. With a real pencil you absolutely always know what to expect. For now, with the Apple Pencil, not so much.

Finally -- and this is another area where the pen-and-paper metaphor breaks down -- when we write on paper with a pencil, we expect the relatively coarse feel of paper. When we pan and swipe on a modern tablet we want as little "stiction" as possible for best operation. Alas -- you can't have both. As is, between the thickness of the glass on the tablet and its super-smooth feel, working with the Pencil feels like writing on a piece of glass. The brain doesn't like that. The brain expects friction. Likewise, when we watch the Pencil operate from an angle, the brain says, "Wait a minute! I can feel the Pencil touch the surface, but the writing is a fraction of an inch away from the tip!" That's parallax, of course.

Handwriting recognition

It's hard to imagine that there was a time when there was fierce competition in the handwriting recognition software market, with various approaches and technologies seeking to address all sorts of writing needs. At one point I had nearly a dozen different recognizers working on a Compaq Concerto tablet convertible that served as my main mobile system for a while in the mid-90s. Today, recognition is essentially a non-factor, what with so many more people keyboard-savvy, and onscreen keyboards working so well.

But handwriting recognition software is still available. In fact, the roots of my favorite app, PhatWare's WritePad, go way, way back to the beginnings of handwriting recognition, and the technology has been refined ever since. It works beautifully on the iPad Pro.

Whether or not handwriting recognition will make a comeback is anyone's guess. Between the maturity of the software and the vastly improved capabilities of hardware it certainly works very well. One thing I found distracting is the clicking noise that the hard pen tip on the glass surface of the iPad Pro makes when handwriting. That was never an issue with the soft display surface of the Apple Newton, but such are the pros and cons of various technologies.

Apple Pencil: Too early to tell

Having worked with pens and tablets for 25 years, I want to like the Apple Pencil. I do believe a pen can greatly add to the iPad experience. But for now, and that is so highly unlike anything from Apple, what comes to mind is "the good, the bad, and the ugly." I love how well the Pencil works and the promise it shows. I am not fond of the price, the length, the ease of losing it, the battery, the ungainly looking tip and the uneven app support. And I am baffled that Apple thinks sticking the Pencil into an iPad port to charge it is even remotely a good idea.

So for me, the jury on the Apple Pencil is still out. But I am certainly glad Apple made it available. -- Conrad H. Blickenstorfer, December 2015

Posted by conradb212 at 6:36 PM

November 20, 2015

Will the Apple iPad Pro herald an era of "pro" use of tablets?

My iPad Pro came in and I want to share my first impressions, and also my thoughts on where all this is leading.

Anyone who first sees the iPad Pro will immediately notice its size. The iPad Pro is big. 12 x 8.7 inches versus 9.4 x 6.6 inches for the iPad Air 2. So the iPad Pro's footprint is 68% larger than that of the standard iPad. Amazingly though, at 0.27 inches the iPad Pro is barely thicker than the iPad Air 2, which is 0.24 inches thick. And even more amazingly, the big iPad Pro is actually thinner than the sleek and slender iPhone 6S Plus (0.29 inches). In terms of weight, the iPad Pro comes in at a very manageable 1.57 pounds, just barely more than the original iPad, which weighed a pound and a half.
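The footprint comparison is easy to verify from the quoted dimensions; a trivial check, with the variable names my own:

```python
# Simple check of the footprint comparison above (dimensions in inches).
ipad_pro_area = 12.0 * 8.7    # 104.4 square inches
ipad_air2_area = 9.4 * 6.6    # 62.0 square inches
increase_pct = (ipad_pro_area / ipad_air2_area - 1) * 100
print(f"{increase_pct:.0f}% larger")  # -> 68% larger
```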


That said, its sheer size remains the iPad Pro's most obvious feature. The picture below shows the (already rather large) 5.5-inch iPhone 6 Plus, the 9.7-inch iPad Air 2, and then the new iPad Pro with a diagonal screen size of 12.9 inches.

What about some of the other display specs? The 5.5-inch iPhone 6S Plus has 1,920 x 1,080 pixel resolution, which means 401 pixels per inch (ppi). The 9.7-inch iPad Air 2 weighs in at 2,048 x 1,536 pixels, for 264 ppi. And the new 12.9-inch iPad Pro sports 2,732 x 2,048 pixels for the same 264 ppi. Super-sharp, all of them. And how bright are the displays? The iPhone 6S Plus is around 500 nits, the two iPads both around 450 nits. Plenty enough even for occasional outdoor use.
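The ppi figures follow directly from resolution and diagonal size: pixels per inch is the diagonal pixel count divided by the diagonal in inches. A quick sketch (the helper name is mine):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch = diagonal pixel count / diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"iPhone 6S Plus: {ppi(1920, 1080, 5.5):.1f} ppi")   # ~400.5, quoted as 401
print(f"iPad Air 2:     {ppi(2048, 1536, 9.7):.1f} ppi")   # ~263.9, quoted as 264
print(f"iPad Pro:       {ppi(2732, 2048, 12.9):.1f} ppi")  # ~264.7
```

The iPad Pro value lands a fraction above the quoted 264 ppi because the 12.9-inch diagonal is itself a rounded figure.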

Setting up the iPad Pro was as simple as it gets. I backed up my iPad Air 2 to my iMac, then went through the setup process for the iPad Pro and selected "restore from backup." That took less than 20 minutes. Then the iPad Pro came up and looked just like my iPad Air 2. But not totally. Only about two-thirds of my apps were loaded; the rest I had to restore from the App Store myself. And whatever required a password on the existing iPad required the password again on the iPad Pro. Otherwise, everything was there. Kindle opened the book I was reading on the same page, and every webpage I'd had open on the iPad Air was also open on the iPad Pro when I first launched Safari. This ease of upgrading is something that I have always loved about Apple.

I had feared that some apps might look terrible on the larger screen, the way iPhone apps looked terrible on the original iPad which had not only a much larger screen, but also much higher resolution. Since the resolution of the iPad Air 2 and iPad Pro is the same, nothing looks unusual, but many apps look sort of too spread out on the much larger screen.

You do get used to the larger display size very, very quickly. After a couple of hours of checking out apps and just doing my regular iPad routine, the Pro, while still feeling unquestionably large, felt right, whereas the iPad Air suddenly seemed to have shrunk.

But does one really need such a large screen? As is and for now, it seems like a luxury. The vast wealth of existing apps has all been designed and optimized for the 9.7-inch iPad display, so it's not as if using the smaller screen was an inconvenience. Not in the way running classic Windows, designed for a big desktop display, can be a challenge on a small mobile device screen.

Where the larger screen will certainly come in handy is in split screen operation. Splitting the screen on the 9.7-inch iPad made for a lot of eye squinting and panning around. On the iPad Pro, working with two apps feels almost like having two iPads side-by-side. The picture below shows the iPad Pro in landscape mode with Excel and Safari side-by-side. Each half is plenty large enough for comfortable viewing.

Problem is, there aren't many apps that support the full split screen yet. By full I mean a 50-50 split instead of the 2/3-1/3 arrangement (sort of like what Microsoft introduced with Windows 8) that many apps are limited to. And sifting through an alphabetically ordered column of apps to pick the one you want in a split part of the screen is hardly the best way to get things done.

Another issue is that apart from the bigger size, there isn't really anything "pro" about the iPad Pro yet. I searched for "iPad Pro" in the Apple store, but couldn't find anything that truly seemed to be made for the iPad Pro, or to take advantage of it. Not yet anyway.

One truly weird thing is that splitting a screen into two is sold and lauded as a marvelous technological advance. What?? For the past 30 years we've had as many resizable windows to work with as we wanted and needed, and now splitting the screen into two is news? No way.

There is, of course, the pen. Sadly, I'll have to wait for mine another three weeks or so, although I ordered it together with my iPad Pro. I did get a chance to play with it in the local Apple store. There's good news and not so good news.

The not-so-good news is much the same that sank the first generation of pen computers in the early 90s and limited adoption of the second generation in the early 00s. The Apple pen (Apple calls it "Pencil") is very expensive (US$99), very large (bigger than a full-size pencil and with an oddly fat tip), and very easy to lose (it's slippery, has no clip, and there's no garage for it on the iPad). Worse, it needs a battery, which here means recharging it via the iPad's port where it looks like it could very easily get damaged. And anything that must be charged will inevitably come up dead when you need it most.

All of the above, and a few other reasons, are why Steve Jobs was adamantly opposed to pens.

The good news, though, is that the pen works very well. Though the one I tried had somehow disconnected itself from its iPad and needed genius intervention, once the pen and tablet talked, the pen worked as smoothly and effortlessly as can be. Pressure results in thicker or thinner lines (hence probably "pencil"), the pen never falls behind, and the much advertised shading when you hold the pen at an angle is marvelous indeed. If all of this sounds like the Apple Pencil is mostly for artists, that may or may not be so. I still love to jot quick notes on whatever scrap of paper is around, and with the right app, scribbling away on the iPad Pro could be a boon. The question then is whether that's enough of a use to justify a thousand-dollar tablet and a hundred-dollar pen.

Few remember today that handwriting recognition was the central concept of the first generation of pen computers. The idea was that everyone knows how to write, whereas not that many could type very well. So let's replace that clunky old keyboard where you have to learn the layout of the keys with a simple pen. The computer then recognizes the writing, and no more need for a keyboard, not even onscreen. Sadly, that never really worked out. It worked for a few (including me), but no computer was fast and smart enough to make sense of the kind of careless scribbling most of us commit to paper. And editing via pen was a pain, too, almost as much as editing via voice (which is why pure voice recognition also isn't winning any popularity contests).

But might useful handwriting recognition be a reason to have the Apple Pencil? That's quite possible. Apple owns excellent intellectual property in the form of the "Rosetta" recognition technology of the late Newton MessagePad that became available as "Inkwell" in the Mac OS. Whether or not this will amount to anything on the iPad with its quick and easy tap-editing is anyone's guess.

Final question: what impact will all of this have on rugged tablets? After all, Apple will likely never make a rugged iPad, and although many rugged cases are available for iPads, more than likely Windows and Android will remain the prevalent OS platforms in rugged computing devices.

The answer, I think, is that anything relating to screen size is worth watching. Trends towards smaller and larger display sizes in certain classes of mobile devices have always been quests for strategic advantage as much or more than technological progress (try to sell a phone with a 3-inch screen today!). And with both Microsoft and Apple (let alone Samsung) now using pens, pens may well regain a much more prominent position in mobile devices.

In closing this article, let's not forget that we're still very much in the midst of a software interface quagmire. Most of today's productivity software was created for use with a mouse on a desktop. Yet, it's a mobile world now where touch has replaced the mouse. Unfortunately, while panning, tapping, pinching and zooming work great in media consumption apps, they lack the precision mouse-era productivity software still requires. And that, my friends, is where the word "pro" ought to come into play. It's not so much the size of the screen as it is how to work on it. And that issue remains to be resolved.

Posted by conradb212 at 4:52 PM

September 18, 2015

What led to the Universal Stylus Initiative

A short while ago I received a press release from the Universal Stylus Initiative. I filed that away in my mind, but got back to it because the concept certainly sounds interesting. Having used, described, tested and compared numerous pen and touch technologies over the past two decades in my work first at Pen Computing Magazine and then at RuggedPCReview, I definitely consider it a relevant and increasingly timely topic (witness Apple's announcement of the iPad Pro with a pen!).

So I spent some time thinking things through and figuring out the need for a universal stylus initiative.

The great appeal of tablets and smartphones, of course, is that they provide tremendous communication and computing capability in small and handy packages that can be taken anywhere. That's in part possible because they don't have physical keyboards that add weight and get in the way. The trade-off, of course, is that without a physical keyboard and a mouse it isn't as easy to enter a lot of data or easily operate the computer.

That's where touch and pens come in. Early tablets (yes, there were tablets in the early 1990s) were called pen computers because that's how they were operated, with active pens. There was touch, too, but it was primarily of the resistive variety that worked best with a pointy stylus. That technology saw its heyday when phones were still dumb and people kept their address books and calendars first on Apple Newtons and then Palms and Pocket PCs.

When Microsoft became interested again in tablets around 2001/2002 (I say "again" because they'd been interested a decade earlier, but primarily to fend off pen-based rivals to Windows) they built the "Tablet PC" around active pen technology. It's called "active" technology because a sensor board behind the LCD detects the precise position of the tip of the pen even when the pen does not actually touch the glass. That's different from "passive" touch technology where a touch is only registered when a finger or stylus touches or depresses the LCD surface.

What are the inherent advantages and disadvantages of active versus passive?

First, active pens make "hovering" possible: a cursor can follow the pen without actually registering a touch. This way, the user knows where the tablet sees the pen, which allows for very precise operation, just as seeing the cursor does when one operates a mouse. Second, active pens can be pressure sensitive. That can be used for 3D-like operation, and is invaluable for artists and designers. Third, active pens can have very high resolution, which makes them quick and very precise, something that's increasingly important on today's super-high resolution displays. On the negative side, active pen technology is fairly expensive. It can be inconvenient to have to first locate the pen before the tablet is operational. And if the pen gets lost, the device may become unusable.

And what about the pros and cons of passive touch technology?

The good thing is that conventional resistive touch doesn't need a special pen. Any cheap stylus will do, as will a fingernail and even firm finger touch. Resistive touch is also fairly precise as long as it's used with a stylus, and it's totally immune to rain or any wetness. For that reason alone, many rugged tablets and handheld computers have been using resistive touch for many years, and are still using it. But passive resistive touch has some significant disadvantages as well. Finger touch alone is very imprecise and unsuitable for operating small user interface components such as scrollers, check boxes and the like. Even when using a passive stylus, there's no cursor to tell you where exactly the touch will be registered. And there's the issue of "palm rejection," i.e. making sure that the device reacts only to the stylus and not to inadvertent contact from the palm of the user's hand.

The above was roughly the status quo until Apple popularized projected capacitive multi-touch with the iPhone. Procap, or p-cap, as it's commonly referred to, is still passive touch. But it's a far more refined and much more elegant type of passive touch. Instead of pushing down hard enough to register a "touch," p-cap works via "mutual capacitance," i.e. the decrease in capacitance between a sensor electrode and a drive electrode when a finger gets close enough to affect (siphon off, really) the normal capacitance between the pair. This technology requires only a very soft touch, and it's quite precise once a user gets the hang of it. It's also quick, because it's electronic rather than physical, and p-cap can easily recognize more than one touch at a time. Apple took advantage of all of these strengths to allow the effortless tapping, panning, pinching and zooming that not only made the iPhone a game changer, but also made p-cap the touch interface of choice for virtually all tablets and handhelds.
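The mutual-capacitance principle can be pictured with a deliberately simplified toy: the controller reads the capacitance at every row/column electrode crossing, and a nearby finger shows up as a drop below the no-touch baseline. The baseline, threshold, and grid values below are all invented for illustration; a real controller adds filtering, baseline tracking, and sub-pixel centroid interpolation.

```python
# Toy illustration (not a real touch controller): a p-cap sensor reads the
# mutual capacitance at every row/column electrode crossing. A nearby finger
# siphons off charge, so a touch appears as a *drop* below the baseline.
BASELINE = 100.0   # idealized no-touch reading (arbitrary units)
THRESHOLD = 15.0   # minimum drop that counts as a touch

def find_touches(frame):
    """Return (row, col) of every crossing whose reading dropped enough."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if BASELINE - value >= THRESHOLD]

# A 3x4 frame with two fingers down. Multi-touch falls out naturally,
# because each crossing is evaluated independently.
frame = [[100,  99, 100, 100],
         [ 80, 100, 100,  78],
         [100, 100,  99, 100]]
print(find_touches(frame))  # -> [(1, 0), (1, 3)]
```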

However, even the wonderful p-cap technology has its disadvantages. First, the subtle change in capacitance between two electrodes when a finger touches it requires a dry surface. Water, with its great conductivity, tricks the electrodes into false readings. Second, since p-cap also doesn't facilitate "hovering" and the finger touch area is fairly large, p-cap operation isn't nearly as precise as that with a mouse or an active pen. Neither of those disadvantages was severe enough to keep p-cap from becoming the success it is. They were, however, the primary reason why some tablets and even phablets became available with active pens. And that even though the late Steve Jobs was adamantly opposed to pens.

There is, unfortunately, no getting around the fact that legacy Windows doesn't work well with p-cap. One result of that recognition was Microsoft's bafflingly unfortunate Windows 8, which imposed not-ready-for-primetime touch functionality on the hundreds of millions using legacy Windows software on the job. Another was that, Jobs' decree notwithstanding, tens of millions bought Samsung's Galaxy Note tablets, which combined p-cap with a little Wacom pen, adding precision when needed, plus a handy tool to jot and draw and doodle.

How did all of this affect the industrial and vertical mobile computing markets we cover here at RuggedPCReview? In a number of ways.

While p-cap totally took over on consumer smartphones, it took years for rugged handhelds to switch from resistive touch to p-cap. That's for two reasons.

First, Microsoft simply didn't provide an upgrade path for Windows Mobile, Microsoft's mini-OS that had dominated industrial handhelds for many years. The p-cap-friendly Windows Phone OS was for consumers, and so Windows Mobile, although at some point renamed Windows Embedded Handheld, became a dead end. The result was that while smartphones charged ahead, vendors of industrial handhelds were stuck with an increasingly obsolete OS. In the consumer smartphone market, Android quickly filled the void left by Microsoft, but industrial and vertical market customers were, and still are, much slower to adopt Android.

Second, vertical market customers often wear gloves, and they often have to work in the rain or where it gets wet, neither of which p-cap handles well. For those two reasons, staying with resistive touch and a passive stylus made sense.

The situation, interestingly, was different with tablets. While the capacitive touch-based iPad was a runaway success that was followed two or three years later with equally successful Android tablets that also use p-cap, Android had a much harder time in industrial and vertical markets. There were a good number of attempts at industrial and enterprise Android tablets, and some saw modest success. But on tablets the pull and advantages of remaining part of the established Windows infrastructure were much stronger, and Windows tablets saw, and see, remarkable success. To the extent where not only the majority of vertical market tablet vendors continue to offer Windows tablets and introduce new ones, but where Microsoft itself is heavily investing into its own Surface tablet hardware.

Which, of course, gets us right back to Windows' weakness with pens. Microsoft's very own tablets use active pens in addition to p-cap, in essence admitting that even when using Windows 10, finger tapping alone just won't get the job done.

So we're sort of back to the same predicament pens had a couple of decades ago. Extra cost, easy to lose, proprietary. You can use any pen or pencil to write on any piece of paper, but a Wacom pen will only work with a Wacom-based tablet, an nTrig pen needs an nTrig tablet, and so on. And none of those proprietary pens could be used on a regular p-cap phone or tablet.

And this, finally, gets me to the Universal Stylus Initiative (USI). USI is a non-profit that was formed in early 2015 specifically with the goal of creating a standard that would allow any active pen to work on any p-cap smartphone, tablet or notebook.

On September 10, 2015, USI announced that its membership had grown to 31 companies, more than doubling since the initial launch in April 2015. Initial membership included such touch/pen heavyweights as Wacom, Atmel, Synaptics, eGalax-eMPIA, Hanvon Pentech and Waltop, as well as electronics industry giants Intel, Sharp, and Lenovo. Since then, Asus and LG have joined, as well as stylus providers (KYE Systems, Primax, Solid Year Co, Montblanc Simplo) and touch controller providers like Parade Technologies, Silicon Integrated Systems, Raydium Semiconductor and STMicroelectronics International.

This immediately brought up the question as to why any vendor of active pen systems would want to have any part of such a thing. After all, these are technologies heavily covered and protected by iron-clad patents and unassailable intellectual property. And the market for (rather expensive) replacement pens is quite profitable.

A visit to USI's website (see here) answered a few questions but not all, as the specification and most of the background technical information, of course, is only available to USI members.

I was fortunate that USI Chairman Pete Mueller made himself available for a much appreciated call. Pete, whose daytime job is that of principal engineer and senior technologist at Intel, explained that while there is indeed extensive intellectual property in active pen technology, the major players in the field have long seen the potential strategic advantage of providing a degree of interoperability. Talks between them are not unusual, as making active pen technology more universally desirable will likely result in a much larger market. (Note that earlier in 2015 Wacom announced a "Universal Pen Framework".)

Think about it: there are by now billions of p-cap smartphones and tablets without active pens. Given the increasing call for productivity-enhancing (i.e. creation, analysis, design, etc.) rather than activity-draining (i.e. news, video, silly cat tricks, games) smartphone and tablet technology, the availability of universally compatible pens might be a massive boon. Unlike today where the only option for tablet users are those awful fat-tipped passive pens that are hardly more precise than fingers, a universal active pen could open up entirely new functionality for little or no extra cost.

Atmel's blog quotes Jon Peddie of Jon Peddie Research as saying, "To date the market has been limited by proprietary touch controller-stylus solutions, which limits OEM choices and cost reductions. With the USI specification released, we expect that the capacitive active stylus market will grow from 100 million units in 2015 to 300 million units in 2018, opening up new markets such as smartphones and all-in-one PCs" (see quote here).

How does USI intend to make that possible? In their words, "the USI standard defines the communication method by which the stylus sends data about the stylus operation to the smart phone, tablet or notebook PC. The data includes information such as pressure levels, button presses or eraser operation. In addition, USI technology makes use of the existing touch sensor (via a technology called Mutual Capacitance) in smart phones, tablets and notebook PCs, so that the added cost for enabling USI technology on these devices is zero or minimal."
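The kind of per-report data USI describes (pressure level, button state, eraser state) can be pictured as a small packed payload the stylus sends to the host. Since the actual USI wire format is available only to members, everything below is my own illustrative assumption, not the spec: the field names, the bit layout, and the 11-bit pressure field sized for 2,048 levels (the figure Atmel's blog mentions).

```python
# Hypothetical sketch of a stylus report payload: pressure plus two flag
# bits packed into one 16-bit word. Layout is invented for illustration;
# the real USI format is not public.
def encode_report(pressure, button_down, eraser_active):
    """Pack pressure (0-2047, i.e. 2048 levels) and two flags: [flags:2][pressure:11]."""
    assert 0 <= pressure < 2048
    flags = (int(button_down) << 1) | int(eraser_active)
    return (flags << 11) | pressure

def decode_report(word):
    """Unpack a report word back into (pressure, button_down, eraser_active)."""
    pressure = word & 0x7FF                   # low 11 bits
    button_down = bool((word >> 12) & 1)
    eraser_active = bool((word >> 11) & 1)
    return pressure, button_down, eraser_active

word = encode_report(1024, button_down=True, eraser_active=False)
print(decode_report(word))  # -> (1024, True, False)
```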

But if we're talking active pens working with capacitive touch controllers, how could those p-cap controllers possibly work with active pens? Pete couldn't go into details on that because much is non-disclosure material, but the general idea that I got was that using a "USI-enabled" pen on a "USI-enabled" device would provide some, but not all of the full functionality of a particular active pen technology.

What does that mean? A look at original USI member Waltop's website provides some clues. It says that they provide both USI-enabled and vendor-specific styli, and that both types "satisfy the performance requirements of Windows 10, such as accuracy, linearity, latency, hovering height, etc." So presumably the USI standard seeks to cover all the mandated basics of using an active pen on a p-cap touch screen, but there are still special capabilities, extra functionality and perhaps higher performance only available through the full proprietary active pen technology. To use one of my beloved automotive analogies, USI provides the road system that allows any street-certified vehicle to get from point A to point B, but if someone has special desires and requirements, they will still want to get a Lexus or a Porsche.

However, Atmel's blog says that "Through the same sensor that one’s finger uses to command a device, the stylus communicates via different frequencies to perform the action of writing — writing with up to 2048 different levels of pressure to give the pen-on-paper experience and render thinner or thicker lines in note-taking, painting and doodling, just like an ink pen." That sounds more like some of the proprietary functionality of an active pen system is being brought over into USI-spec passive p-cap controllers.

Poking around the web a bit, it seems like USI systems will be able to differentiate between a USI mode and a proprietary mode, or even switch between the two, depending on which seems appropriate. USI pens apparently will be powered by a AAAA battery, allowing a slender size and a projected 12-month battery life.

As is, USI hopes to have version 1.0 of their specification done by the end of 2015, and after that we should start seeing active pens that work on any p-cap device that's also compliant with the USI spec. It should be interesting to see what will become available, how well it works, and whether the initiative will take. -- Conrad H. Blickenstorfer, September 2015

Related:

  • Website of the Universal Stylus Initiative
  • The Universal Stylus is Coming (Intel Free Press)
  • Universal stylus to bring easy digital inking to tablets (TechRadar)
  • Early example of a USI pen: Baton US-70

    Posted by conradb212 at 5:47 PM

    August 27, 2015

    Replacing the Atom N2600

    This morning I received an email from German touchscreen device developer and manufacturer faytech. The company prides itself in its ability to design and engineer high-quality products in Germany, then have them manufactured cost-efficiently in Asia, while providing local service.

    This email, though, wasn't about their latest touchscreen products. It was about the processors they use in their touchscreen panels and PCs. Specifically, it was about replacing the ubiquitous Intel Atom N2600 with the newer Intel Celeron N2807 and J1900. Faytech seemed especially taken with the N2807, which the company has chosen as the new standard for their resistive touchscreen portfolio. They said, "replacing the predecessor Intel Atom N2600, the new processor has it all: speed, stability, lower power consumption, and much better performance per watt ratio." The Celeron J1900, by the way, will be the new go-to chip for faytech's capacitive touch devices.

    That caught my attention. Are the N2807 and J1900 Celerons really the N2600's successor? And if so, why? As is, Intel is making so many different processors and has so many different classifications for them that even those following the industry closely often can't tell them apart or explain why one processor should be chosen over another.

    First, why does the N2600 need replacement? True, the "Cedarview" Atom N2600 was launched well over three years ago, an eternity in Intel's rapid-fire chip development cycle. But it turned out to be an exceptionally good chip.

    A third-generation descendant of the Atom N270 that powered tens of millions of netbooks, the N2600 and its slightly faster N2800 sibling were the first Atom chips to use 32nm process technology instead of the older 45nm, making for smaller, more efficient packages. Cedarview processors were dual-core designs, whereas before only desktop-oriented Atom versions had two cores. Graphics performance benefitted from a different design and much faster clock speed, resulting in Intel claims of 2X graphics performance compared to the second generation Atoms. And integrated hardware-accelerated video decoding enabled smooth full HD (up to 1080p) video playback, something that was not possible with earlier Atom chips.

    Word on the N2600's qualities got around, and a lot of manufacturers that had been burned by the poky performance of most original Atom chips switched to the N2600. When RuggedPCReview benchmarked an N2600-updated Handheld Algiz 7, we found an overall 3X performance improvement over the earlier Atom Z530-based version. Another example was Motion Computing's upgrade from the Atom Z670 in its original CL900 tablet to the N2600 in its CL910 successor. Again, the performance improvement was substantial (in the 70% range).

    We praised the N2600 as "probably the best general purpose Atom chip so far." N2600 performance was so good that some manufacturers who later upgraded to some of the lower-end Intel "Bay Trail" chips were in for a harsh surprise. For example, RuggedPCReview found a "Bay Trail" E3825-based tablet no faster than its N2600-powered predecessor.

    But that was in 2013, and it's now Fall 2015. The N2600's reign seems to be coming to an end, and Intel's "Bay Trail" platform is the reason.

    Bay Trail represents a departure from Intel's product strategy of the past few years, which differentiated between low end (Atom) and high end (Core) processors. Instead, Bay Trail consists of a large family of single-, dual-, and quad-core chips optimized for various types of devices. Lower end Bay Trail processors use Intel's "Atom" brand, whereas higher end versions targeting tablets, notebooks and desktops carry Intel's "Celeron" and even "Pentium" brand names.

    Further, for the first time, an Intel Atom microprocessor architecture is paired with genuine Intel graphics. The graphics cores integrated into Bay Trail systems are of the same HD 4000 architecture and variety as those used in Intel's 3rd generation "Ivy Bridge" processors, albeit with fewer execution units (four instead of several times that number) and lower clock speeds. The new graphics support most of the same APIs and features, including DirectX 11, OpenGL 3.x (and even 4.0 in some cases), and OpenCL 1.2. Better yet, some clever power-saving features from 4th generation "Haswell" Core processors seemed to be included as well.

    So it's no surprise that Bay Trail has been a resounding hit. By and large, the impression on the street is that "Bay Trail" is much faster than all those old-style Intel Atom chips, fast enough to do some actual general purpose work, even tough assignments that just may come along on any given day. That includes the kind that brought almost every older Atom system to its knees. And it all comes at a cost that's a lot lower than full Intel Core processors. From Intel's vantage point, Bay Trail cut down on the complaints about Atom performance while, despite all the improvements, still remaining far enough behind full Core processor performance to pose no threat to that lucrative market.

    The only problem is that it further increased confusion about Intel's various product lines. Bay Trail chips, while all using an Atom CPU architecture, are sold as Atoms, Celerons and Pentiums. Aren't Celerons gutless entry-level loss-leaders and Pentiums some ancient brand from a distant past? Not anymore, apparently. Or at least it's a different kind of entry level now. Figuring out the difference between all those Bay Trail chips isn't easy. And to make matters more confusing yet, some new Celerons aren't Bay Trail chips at all; they are Intel "Haswell" Core processors (like the Celeron 2980U).

    So what about the Celeron N2807 and J1900 that the good folks at faytech chose to replace the N2600 as the standard in their touch PCs and panels? Let's take a look at the two chips.

    Both of them are based on 22nm lithography instead of the older N2600's 32nm, both use the FCBGA1170 socket, and both use low-power DDR3L RAM. But that's where the similarity stops.

    The J1900, which Intel lists as an embedded desktop processor, is a quad-core chip with 2MB of L2 cache, running at a base frequency of 2GHz and a maximum burst frequency of 2.42GHz. Its thermal design power is 10 watts, it supports up to 8GB of RAM, and its graphics run at a base frequency of 688MHz with bursts up to 854MHz.

    The N2807 is listed as an embedded mobile processor. It is a dual-core chip with 1MB of L2 cache, running at a base frequency of 1.58GHz and a maximum burst frequency of 2.16GHz. Its thermal design power is 4.3 watts, it supports up to 4GB of RAM, and its graphics run at a base frequency of 313MHz with bursts up to 750MHz.

    For a more detailed spec comparison of the N2600, N2807 and J1900, check this page.

    In terms of raw performance, the J1900 would seem to have a clear advantage over the N2807, even though Intel actually lists the recommended price of the N2807 as higher than that of the J1900 (US$107 vs. US$82). Why is the slower N2807 more expensive? Likely because as a "mobile" chip it includes additional low power modes. It also includes a number of special Intel technologies that the J1900 doesn't have (like Intel Secure Key, Intel Idle States, Intel Smart Connect, and Intel Wireless Display). Unfortunately, even Intel's spec sheets only present an incomplete picture as the sheets are inconsistent. The technically minded will find some more info in the very technical Features and specifications for the Intel® Pentium and Celeron Processor N- and J- Series document.

    What about real world performance? Here we can present some hard data.

    According to RuggedPCReview's database, N2600-based devices scored an average of 476 in the PassMark 6.1 CPU Mark test, 433 in the overall PassMark suite, and 56,070 in the CrystalMark suite. J1900-based devices scored an average of 2,068 in the PassMark 6.1 CPU Mark test, 974 in the overall PassMark suite, and 117,000 in the CrystalMark suite. The sole N2807-based device scored 782 in the PassMark 6.1 CPU Mark test, 570 in the overall PassMark suite, and 83,800 in the CrystalMark suite. Based on these numbers, one might expect an N2807-based system to offer a roughly 1.5X overall performance increase over an N2600-based device, and a J1900-based system a roughly 2.2X overall performance increase over an N2600-based device. And a J1900 system might offer roughly 1.5X overall performance over an N2807-based device.
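    The ratio math behind these estimates can be sketched in a few lines of Python. The scores are the averages quoted above; how the three per-test ratios get combined into a single "overall" multiple is a choice on my part (here, a plain geometric mean), so the headline number shifts depending on which tests you weight most heavily.

```python
# Back-of-the-envelope benchmark comparison using the average scores quoted
# in the text (RuggedPCReview database). The geometric-mean summary is an
# assumption for illustration, not RuggedPCReview's official methodology.
from math import prod

scores = {
    #         (PassMark CPU Mark, PassMark overall, CrystalMark)
    "N2600": (476, 433, 56070),
    "N2807": (782, 570, 83800),
    "J1900": (2068, 974, 117000),
}

def ratios(chip, baseline="N2600"):
    """Per-benchmark score ratios of `chip` versus `baseline`."""
    return tuple(a / b for a, b in zip(scores[chip], scores[baseline]))

def geomean(values):
    """Geometric mean -- one way to condense the ratios into one number."""
    return prod(values) ** (1 / len(values))

for chip in ("N2807", "J1900"):
    r = ratios(chip)
    print(chip, [round(x, 2) for x in r], "geomean:", round(geomean(r), 2))
```

    Run this and the N2807 lands at roughly 1.5X the N2600 across all three tests, while the J1900's multiple spreads from about 2.1X (CrystalMark) to over 4X (CPU-only), which is why any single overall figure for it is necessarily approximate.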

    So is replacing the venerable N2600 with either one of those Bay Trail chips a good idea? Yes. Time stands still for no one, not even for a good chip like the Atom N2600 of just three or so years ago. But we also believe that, given the overall breathtaking pace of progress in the CPU arena, replacements should provide a clear and very noticeable boost in performance, not just an incremental one. So our money would be more on the Celeron J1900, which seems to have all the goods to be a solid, remarkably speedy, yet still economical go-to chip for any number of industrial computing projects where absolutely minimal power consumption is not the highest priority.

    There is, of course, a large number of other processor options from Intel and from other sources. But the x86 world likes standards and things to rely on, and so we've historically seen a flocking to a small number of chips that offer a good overall balance. Chips like the Celeron N2807 and J1900.

    Posted by conradb212 at 4:19 PM

    April 17, 2015

    Xplore Technologies acquires Motion -- How it came about

    Today I listened to the full broadcast of the investor call Xplore held on April 16 about its acquisition of Motion Computing, and a lot of things are clearer now (listen to it here).

    Motion didn't exactly choose to be acquired, and this was not one of those situations where a big company comes along and makes a financial offer just too sweet to resist. What happened was that Motion found itself in a financial bind caused by third-party issues over which Motion had little to no influence. Specifically, the supplier of the displays used in their Motion C5 and F5 semi-rugged tablets shut down its plants in South Korea without any notice to speak of. This left Motion, which uses a built-to-order model, high and dry and unable to fill C5 and F5 tablet orders, with those two products accounting for about half of Motion's sales. With half of its sales essentially on hold, Motion's financial situation quickly went from quite stable to critical, until their main lender foreclosed.

    This requires some background information. Motion, like most US electronics vendors, relies on Asian OEMs (Original Equipment Manufacturers) and ODMs (Original Design Manufacturers) to make its products. There are various nuances to such agreements, but suffice it to say that those OEMs and ODMs likewise rely on their vendors to supply parts. One such part was screens from a company called Hydis. Unfortunately, Hydis' parent company, Taiwan's E Ink, saw fit to close two of the Hydis LCD manufacturing plants in South Korea.

    Now one might assume it should be easy to source replacement screens for tablet products that, while quite successful, were not produced in the millions or even hundreds of thousands. It obviously can be done, but it's not easy. There's locating a suitable replacement, there's business arrangements, there's adapting and configuring and testing, all of which takes time, time which isn't available to a company doing build-to-order. Components are changed and re-sourced all the time, but always with proper notice and proper lead time. Apparently, that wasn't the case with E Ink's shutdown of the Hydis plants.

    A bit more about Hydis. They make screens that we here at RuggedPCReview have long considered the very best. Originally launched by Korea's Hyundai as Hyundai Display Technology and then part of Hyundai's Hynix Semiconductor, Hydis started working on a wide-viewing-angle technology called "fringe field switching" (FFS) in 1996. That came in response to Hitachi launching the IPS display technology, which also provides superior viewing angles. Hydis was so convinced of the merits of their FFS technology that they decided to pretty much bet the farm on FFS. That was understandable, as FFS not only provided 180-degree viewing angles from all directions, but also offered terrific contrast ratio, none of the dreaded color shifts that lesser LCDs to this day display when viewed from certain angles, lower power consumption than IPS, and no "pooling" upon touch.

    Hydis was spun off from Hyundai completely in 2001, just when Microsoft's Tablet PC effort got underway, and Hydis saw a big opportunity to be the dominant player in tablets. I recall that at Pen Computing Magazine we were blown away when we saw Hydis FFS displays in Sharp's Actius TN10W and HP/Compaq's TC1100 notebook convertibles in 2003. The Hydis displays were so much better than anything else that there simply was no comparison.

    Just when things appeared to look bright for Hydis, Hynix sold them off to Chinese LCD manufacturer BOE, and the company became BOE Hydis. Between the Tablet PC never living up to expectations and other issues, BOE Hydis didn't do well and was acquired by Taiwan's Prime View International (PVI), which eventually, in 2010, became E Ink, the folks who had pretty much cornered the market on those paper-like eBook reader displays used by the Kindle and also by Sony. PVI had actually managed to nurture Hydis back to financial health, but did so primarily by selling and licensing Hydis FFS patents. This led to Korean protests that after BOE, E Ink was also simply "asset-stripping" Hydis and thus Korean intellectual accomplishment. Add to that the fact that E Ink fell on hard times itself after the eBook market was severely impacted by the iPad and iPad-class tablets, and it's no surprise that they didn't have the resources to properly fund Hydis.

    That said, and much to Hydis' credit, the company did not rest on its FFS laurels. First came an improved FFS, and then, in 2007, AFFS+, which perfected the technology in numerous respects, including even better outdoor viewability.

    Motion, always on top of technological advances, was an early adopter of Hydis displays, giving it an edge over competitors that used lesser LCDs not nearly as well suited for tablets, where the ability to view the image on the display from any conceivable angle matters much more than in laptops. The superior quality of the Hydis AFFS+ displays used in Motion's C5 and F5 tablets contributed to their wide acceptance, and continued to do so even in the latest generation of the platform, launched in February 2015.

    Unfortunately, February 2015 was also the month in which E Ink suddenly shut two Hydis plants in South Korea. The stated reasons were "chronic losses" and "high manufacturing costs." That didn't sit well with the Koreans, who felt that E Ink had let the plants become obsolete, on top of simply mining Hydis for its patents. The bottom line for Motion was that they had a very promising new generation of their C5/F5 platform based on Intel's latest 5th generation "Broadwell" chips, and no screens.

    No product, no sale, and the rest is history.

    Enter Xplore. Located within ten miles of one another, the two companies knew each other well. Some of the workforce had worked at both companies, and both certainly also benefitted from the presence of Dell which, although it had no deep business relationship with either Motion or Xplore, ensured a steady pool of highly qualified technologists in Austin.

    But if Hydis displays were so good, and especially well suited for tablets, didn't Xplore use them as well? They did. Xplore, too, was an early Hydis adopter, and that move may well have been what helped Xplore survive and eventually thrive while some of its direct competition did not. Xplore's high-end, ultra-rugged iX104 XC6 tablet has a Hydis screen. So didn't the Hydis shutdown affect Xplore as well? It did, but Xplore had inventory and did not entirely rely on build-to-order. And while the Hydis shutdown certainly had an impact on Xplore as well, their financial situation was different from Motion's, and they were able to not only absorb the blow but also turn it into an opportunity by taking over a very complementary Motion Computing.

    There haven't been many better examples of making lemonade from lemons. Had Xplore not been there, just ten miles away and a perfectly logical rescuer, Motion would most likely have been liquidated and ceased to exist entirely. That would have been a massive blow to Motion's customers and employees, and also to the rugged computing industry in general. As is, Xplore says that "the vast majority" of Motion employees have been extended employment offers.

    So what can we expect from all this? As is, Motion's sales apparently peaked in 2012 at over $100 million, but were still in the $80 million range for 2014. Xplore is a roughly $40 million business. Xplore warns that this doesn't mean that they're suddenly a $140 million business, and that it'll take a year or two until everything has been integrated and ironed out. Xplore Chairman Philip Sassower likened the opportunity to being given the chance to pick up a distressed mansion in a $5 million neighborhood for three quarters of a million. It'll take work and perhaps half a million in investment, but then it'll be a $5 million mansion again.

    And then there are the numbers. VDC estimates the 2015 market for rugged tablets to be worth about $585 million, and the 2015 market for rugged laptops about $1.2 billion. And that's on top of a massive number of consumer tablets, a portion of which go into enterprises that would just love to have more durable gear. So there's plenty of upside. Xplore is already working on getting suitable replacements for the Hydis screens. And Sassower wants Xplore to be the #1 in rugged tablets. -- Conrad H. Blickenstorfer, April 17, 2015

    Posted by conradb212 at 7:46 PM

    Xplore acquires Motion -- what it means

    On April 16, 2015, Xplore Technologies and Motion Computing announced that Xplore was acquiring Motion. This was not a total surprise, as both companies are in the rugged tablet computer market, both are pioneers in tablets, and both are located within ten miles of each other in Austin, Texas.

    And yet, the announcement came as a surprise to me. When I interviewed Motion CEO Peter Poulin in February of this year, Poulin had ended by saying "Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3," and he followed up by saying the company had just totally revamped all of their platforms for much greater performance and enhanced wireless communication and ruggedness. And that they had other products in the pipeline. "We're quite optimistic," Poulin concluded. And yet, just a couple of months later, Motion was acquired.

    The move was also a surprise because both Xplore and Motion have shown remarkable resilience in the face of setbacks and challenges throughout their existence. In an era when numerous rugged computing gear manufacturers either folded or were absorbed by either Motorola or Honeywell, Xplore and Motion persevered and remained independent.

    As a privately held company, Motion's business fortunes were closely guarded, but the company's savvy, skills and determination were apparent throughout its history, starting with the unenviable task of taking Microsoft's flawed 2001/2002 Tablet PC initiative and running with it. Though highly publicized and initially supported by most major PC manufacturers, the Tablet PC didn't find widespread acceptance due to high costs and technology that just wasn't quite ready yet. Yet, Motion toughed it out and established for itself a nice niche in enterprise and vertical markets.

    Both Xplore and Motion were especially skillful in recognizing valuable technology advancements early on, and quickly making them available to their customers. Both companies were pioneers in such productivity-enhancing features as dual input where pen and touch worked in harmonious unison, superior outdoor-viewable displays, ergonomics suitable for actual tasks at hand, and the ability of their products to not only hold up in challenging daily use, but also perform at full speed under any operating conditions.

    On the Motion side, the company's early adoption of Intel's Mobile Clinical Assistant (MCA) platform was an impressive example of their unerring compass of what worked and made sense, and what didn't. Motion's C5 MCA -- with its square layout, integrated carry handle, and peripherals placed in the exact right spots -- became a big success, so much so that Motion added an F5 version of the platform for general enterprise and industrial use. Most impressively, while a good dozen other companies also introduced Intel MCA-based tablets, most quickly abandoned them again, lacking Motion's razor-sharp focus on their markets and tablet products.

    Fellow Austin resident Xplore impressed through sheer determination. Time and time again Xplore found new investment as the company's leadership tirelessly presented its case. Which wasn't always easy with what, for a long time, was essentially a one-platform product lineup.

    I well recall first seeing them at a Comdex trade show in Las Vegas in the late 1990s where they had a large, terrific display, a convincing message, and jaw-dropping prototypes that, however, were not quite final yet. That was quite common back in those days, and most of the attempts led nowhere. But Xplore was back the next year, and the year after that.

    When we published the Pen Computing print magazine and did annual Editor's Choice and best product awards, Xplore scored with its impressive GeneSys Maximus. I remember calling Xplore with the good news, and they were disappointed that it was the GeneSys that got the recognition, and not their then semi-secret brand-new iX104. Little did we know that that machine was to become the core engine of Xplore's success and future.

    So why Xplore and Motion got together now, after all those years, I don't know. Business imperatives, I assume, and I am sure it makes perfect sense. But what does it mean looking forward, especially in the light of many such acquisitions that did not work out for the best? In the past we've seen large companies almost mindlessly snapping up much smaller ones. That's not the case here. We've seen fierce competition where one competitor eventually came out on top and annihilated the other. That's not the case here either. So let's see what Xplore and Motion bring to the table.

    Historically, Xplore has been tending to the ultra-rugged tablet market whereas Motion concentrated on a variety of vertical markets that required durable, specially designed and configured tablets. Motion does not have anything that competes with the various versions of Xplore's ultra-rugged iX104 tablets (see here). Xplore doesn't have anything like Motion's C5 and F5 semi-rugged tablets with their integrated handles. Xplore also doesn't have anything like Motion's R12 tablet with its big 12.5-inch screen (see here). So there's no overlap there. And Motion doesn't have anything Android-based, whereas Xplore has its modern, innovative RangerX tablet.

    There is a degree of overlap in just one area, and that's in the promising and potentially quite lucrative area of compact lightweight Windows tablets. Those are the tablets for users who do need Windows, but want it in a trendy, sleek and attractive iPad-like design that's tough enough to hold up on the job. For that Xplore has its Bobcat (see here) and Motion has its CL920 (see here). These two, though, are also different enough to co-exist in the short term, with the CL920 as the unified company's enterprise-market tablet entry and the Bobcat for tougher assignments that require more armor and a higher level of sealing.

    Most importantly, there is very little existing customer overlap with these two companies. Xplore has traditionally concentrated on oil & gas, military, government, heavy industry, and similar, whereas Motion is primarily active in healthcare, retail, construction, field service, and so on. If Xplore plays its cards right, it can emerge as a much larger company with a much broader reach, and also perhaps as an example of where 1 + 1, for once, adds up to more than 2. I've said it ever since the consumer tablet boom began, and I'll say it again: with the tablet form factor fully accepted virtually everywhere, there's tremendous opportunity for rugged equipment vendors to step in and successfully provide this desired and contemporary form factor in products that do not break on the job and in the field.

    Overall, this development may also be good news for other independents in the rugged tablet market, companies like Getac, GammaTech, MobileDemand, the Handheld Group, and others: resistance is not futile. Keeping it in the family and preserving the unique, special expertise of the rugged computing industry may well be the best way to success and prosperity.

    -- Conrad H. Blickenstorfer, 4/16/2015

    Posted by conradb212 at 12:05 AM

    February 10, 2015

    Conversation with Peter Poulin, CEO Motion Computing

    On February 5th I had a chance to speak with Peter Poulin, who was appointed Motion Computing's CEO on December 11, 2014. An industry veteran with more than 25 years of sales, marketing and general management experience in the public and private sectors, Poulin, according to the company's press release, will seek to capitalize on the company’s deep mobility expertise and aggressive investments in the design and development of ruggedized tablet platforms and integrated mobility solutions to expand its reach within target vertical markets.

    Over the years, I've been covering Motion's vertical market tablet lineup in some detail, going back to a meeting in San Francisco in 2001 where Motion CEO Scott Eckert and founder David Altounian showed me the prototype of their first tablet. Motion was a startup then, formed to be part of Microsoft's Tablet PC initiative.

    While the overall Tablet PC project was not as successful as Microsoft had hoped, and it would be almost another decade before the iPad finally put tablets on the map, Motion established itself as a provider of enterprise and ruggedized tablets in various niche markets. Motion succeeded where many others failed with their early tablet efforts due to focusing on tablets and tablets alone (Microsoft itself had flip-flopped during the Tablet PC gestation period, switching emphasis from pure tablets to convertible notebooks), and also by displaying an unerring ability to recognize new trends and technologies at an early stage and making them available to their customers.

    One look at Poulin's resume shows that he's uniquely qualified for the job as Motion's CEO. An electrical engineer with a degree from Cornell, Poulin worked for Compaq for 13 years in sales, marketing and management. He then broadened his expertise and horizons with sales and business development stints at such diverse technology companies as Segway, NetBotz, APC, internet solutions providers Hoovers and Virtual Bridges. Poulin joined Motion Computing in July 2012 as VP of marketing and then ascended to CEO.

    Here are some of the highlights of our discussion:

    RuggedPCReview: Microsoft's Tablet PC initiative wasn't a great success and most early tablet providers exited the market within just a couple of years. Except Motion. What made Motion successful with early Microsoft-based tablets where others failed?

    Poulin: The answer is, it's not just about the tablet. It’s really about understanding the customers’ workflow, and integrating the technologies that enable that workflow, of which the tablet is one component. Motion decided early on to focus on a limited number of verticals. In the process we gained a great amount of expertise on how customers use technology. I believe what differentiates Motion is that we have very purpose-built devices that truly make the job easier. An example is the unique keyboard pairing we use with our R12 tablet. It's super-easy and there's none of the frustration users often have with Bluetooth pairing sequences. We know how field service workers work, we know how to build docks that work for them, peripherals that work for them, features that they need. Yes, we seek to grow as a company, but we are careful not to lose that depth and connection to our customers and spread ourselves too thin.

    RuggedPCReview: Ever since the introduction of the iPad, tablets have become a huge success. But the success is primarily in consumer markets and to some extent in the enterprise. Why is that?

    Poulin: We see the tablet as a very successful enterprise tool, and we have the mass consumer adoption of the tablet to thank. However, the consumer and the enterprise have very different needs. For many enterprises and vertical market businesses it's a matter of how to reduce deployment risks. They want to know how they can protect their investment. They need to leverage existing investment. They need to reduce downtime. They need to focus on user work flows. And they know that if their users don't embrace a technology it just won't work. One of our customers, Thames Water, engaged in extensive user testing (200 users) before making a decision. Our Motion F5 was chosen as the clear favorite in user testing with 86% preferring the device over two competing alternatives. Our tablets are replacing a fleet of legacy handhelds to run Thames Water’s SAP and ClickSoftware asset and field management systems. User testing and user acceptance were key elements in Thames decision to choose Motion.

    RuggedPCReview: Over the years, Motion has generally been a pioneer in quickly making new and better technologies available to users. Examples are superior displays, input technologies, the latest processors, new form factors, etc. Is this part of Motion's corporate culture?

    Poulin: Motion has always had a lot of excellent tech people. We have the discipline of big corporation experience, complemented by the agility of startup experience, and that helps us move fast, be first, and be innovative. This has undoubtedly shaped Motion's culture. But I believe we also have a great balance between technical and customer experience. While the tech innovations are most visible, we're also constantly working on details, peripherals, modules, and how to easily make them part of our tablets. That takes a lot of risk out of integration, and our customers appreciate that.

    RuggedPCReview: We currently have this interesting situation where Apple and Android-based tablets almost completely dominate the consumer markets, whereas Microsoft remains strong in tablets designed for enterprise and vertical markets. For now, all of Motion's tablets use Windows. How do you see the Windows versus Android situation?

    Poulin: We watch that situation very, very carefully. I think one difference between consumer and vertical markets is that on the vertical side it all depends on application software, the kind that does all the heavy-duty lifting, and almost all of that runs on Microsoft. Are verticals adopting Android? Yes, to some extent. Some of our customers are trying Android with very narrow apps for certain very specific tasks. The challenge for Android comes with heavier duty apps, development and maintenance cost, and the fact that, for now at least, Android changes so very quickly and older versions are no longer supported. For IT organizations, that cadence of change is painful.

    RuggedPCReview: Microsoft is putting a heavier emphasis on cloud services. Where do you stand on that?

    Poulin: Given the ubiquity and ever-increasing performance and reliability of broadband connections, Motion is paying a lot of attention to cloud-based applications and services. Along with that, security is becoming an ever-greater concern, both in the cloud and also with broadly distributed devices. Motion has long considered security as an integral part of our products and services with TPM, Computrace, multi-factor authentication, etc. In our newly-released F5m and C5m tablets, we're stepping security up by another level with self-encrypting drives.

    RuggedPCReview: While Microsoft certainly still has a huge edge in enterprise and vertical market deployments, there are also challenges as Microsoft attempts to integrate mobile markets into its OS strategy.

    Poulin: Yes, there's certainly a challenge with Windows 8 and 8.1, but overall they're getting bashed a bit too much. Microsoft hasn't done bad, and things are only getting better now. Microsoft is just so eager to get it right that perhaps they moved to catering to consumers a bit too fast, and that can be very disruptive to the enterprise. Then there are the migration issues. Windows 7, Windows 8, Windows 8.1, and soon Windows 10, and they need to support everything. It's not easy to make an OS attractive to consumers as well as corporate customers.

    RuggedPCReview: On the hardware side, Intel has been charging ahead at a very rapid pace with successive generations of Core processors. How difficult and important is it to keep up with Intel?

    Poulin: It's not that complicated on the high end, because the performance levels are there, and have been there for a while. Motion customers do not always want such a rapid pace, so sometimes they skip a generation, and sometimes it's tempting to skip two. It's more complicated at the low end, where it took Intel a while to get up to speed with the Atom platform. That was a bit tough for a while, but they're now sorting that out, and Motion is very confident in the range and predictability of Intel's product roadmap.

    RuggedPCReview: We can't help but notice that Austin, Texas, seems to be a hotbed for tech development and rugged systems. Dell is there, of course, and Motion, and also Xplore. What makes Austin special?

    Poulin: There's lots of talent in the Austin area. There are lots of big companies and also a vibrant startup community. Somehow it all came together.

    RuggedPCReview: Where, overall, does Motion stand now, and what are the plans for 2015 and beyond?

    Poulin: Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3. And we've just totally revamped all of our platforms, the CL-Series, the C5 and F5 Series, and the R12. All have much greater performance, and we also enhanced wireless communication and ruggedness. And we have other products in the pipeline. So we're quite optimistic.

    -- Conrad H. Blickenstorfer, Editor-in-Chief, RuggedPCReview

    Posted by conradb212 at 4:34 PM

    December 21, 2014

    Storage wars

    Anyone who's ever watched the "Storage Wars" reality TV series on the A&E Network knows that with storage, you never know what you're going to get. That's true for stuff people stow away in storage units, and it's also increasingly true with the kind of storage in our electronic devices.

    There was a time when RAM was RAM and disk was disk, and for the most part the only rule was that more was better. But that was in the era when you could count the total number of available Intel processors on the fingers of a hand or two instead of the roughly one thousand they offer now.

    These days things are much more complicated. And just as no one seems to quite know for sure why a certain processor was chosen for a particular product, there aren't easy answers as to why a product uses this type of storage versus that. There often appears to be a disconnect between the engineers who make those often arcane decisions and the marketing side of things that must explain what it's for and why it's better.

    Here are a couple of examples we've recently come across:

    DDR3L vs. LPDDR3 — In mobile computing devices designed to draw as little power as possible, you generally not only find "low power" processors (where low power refers to the low amount of electricity used, not the chip's performance), but also DDR3L random access memory. In the case of DDR3L, the L stands for a lower-voltage version of the DDR3 memory standard, and more specifically 1.35 Volts instead of the standard 1.5 Volts. That kind of makes sense.

    However, you now increasingly also see devices with LPDDR3 RAM, especially in handhelds and tablets. LPDDR3, it seems, was created specifically for such devices, and their need to be able to go into standby modes where the memory uses as little power as possible. LPDDR3 runs on 1.2 Volts, is part of the main board, and generally uses only about a tenth as much power while in standby as does regular DDR3 RAM.
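    As a very rough illustration of why those voltage steps matter: dynamic power in CMOS circuits scales roughly with the square of the supply voltage. The little sketch below computes only that voltage-squared ratio; it deliberately ignores frequency, capacitance, and the standby-mode optimizations that account for most of LPDDR3's real-world advantage, so treat it as back-of-the-envelope math, not a memory power model.

    ```python
    # Back-of-the-envelope: CMOS dynamic power is roughly P ~ C * V^2 * f,
    # so at equal capacitance and clock, power scales with voltage squared.
    # This ignores standby-mode savings, which dominate in LPDDR3.

    def relative_dynamic_power(volts, volts_ref=1.5):
        """Rough dynamic power relative to a 1.5 V DDR3 baseline."""
        return (volts / volts_ref) ** 2

    for name, volts in [("DDR3 (1.5 V)", 1.5),
                        ("DDR3L (1.35 V)", 1.35),
                        ("LPDDR3 (1.2 V)", 1.2)]:
        print(f"{name}: {relative_dynamic_power(volts):.0%} of DDR3 dynamic power")
    ```

    By this crude measure, DDR3L lands at about 81% and LPDDR3 at about 64% of standard DDR3's dynamic power, which hints at why the lower-voltage parts show up in battery-powered devices.
    
    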

    Initially I thought LPDDR3 was a lower-cost type of memory, as we first saw it in lower-cost units. But apparently its cost is actually higher than that of standard or even low-power DDR3. And it's used in high-end devices such as the Apple MacBook Air.

    SSD vs. eMMC — We've long been seeing substantial differences in benchmark performance between different types of solid state storage. There's generally not much of a difference between the performance of different rotating media, a large difference between solid state storage and rotating media (solid state is much faster), and a very large difference between different types of solid state storage.

    Recently we observed an almost 10:1 difference between the SSD and eMMC mass storage performance in two tablets of roughly the same size, one low end and one higher end. SSD stands for Solid State Disk, eMMC for embedded Multi Media Card. The generic difference between the two, apart from great variance in the speed of the flash memory itself, is the sophistication and complexity of the controller and interface. eMMC uses a basic controller and a relatively slow interface, whereas SSDs use complex controllers and one of the various SATA interfaces.
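    For readers who'd like to see the difference on their own hardware, here's a minimal, hypothetical Python sketch of a sequential throughput test. This is not the benchmark suite we use for our reviews; real tools also measure random 4K I/O at various queue depths, which is where eMMC controllers fall furthest behind SSDs.

    ```python
    import os
    import time

    def sequential_write_read_mbps(path, size_mb=64, block_kb=1024):
        """Crude sequential write/read throughput test, in MB/s.

        Caveats: the read pass may be served from the OS page cache,
        and a single run is noisy; real benchmarks drop caches,
        average several runs, and also test random 4K I/O."""
        block = os.urandom(block_kb * 1024)
        blocks = (size_mb * 1024) // block_kb

        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # make sure data actually hit the device
        write_mbps = size_mb / (time.perf_counter() - start)

        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(block_kb * 1024):
                pass
        read_mbps = size_mb / (time.perf_counter() - start)

        os.remove(path)            # clean up the test file
        return write_mbps, read_mbps
    ```

    Run it once against a file on an SSD-equipped machine and once on an eMMC-based tablet, and the gap between the two classes of storage tends to show up immediately, even in a test this simple.
    
    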

    There's always an answer to any tech question; it's just that those answers aren't easy to come by. What I'd like to see more of is how something works, what its pros and cons are, and why it's used in certain products.

    Posted by conradb212 at 4:28 PM

    November 4, 2014

    Intrinsically safe ecom Tab-Ex: another rugged tablet based on Samsung hardware

    This morning I saw in the news that at the IFA Berlin show, ecom instruments launched what the press release called the "world's first Zone 1/Div. 1 tablet computer". New tablets are launched all the time, but I quickly realized that this was relevant for two reasons:

    First, the Zone 1/Div. 1 designation means it's a tablet for use in hazardous locations. Zone 1, in the IEC/ATEX classification system that handles intrinsic safety issues, means the device can safely be used in areas where flammable gasses are likely present. In essence, that requires that there's no chance the device can generate sparks or other means that could lead to explosions. I'd need to look up what exactly Div. 1 refers to; there are two different entities handling these classifications, the North American National Electric Code (NEC) and the European ATEX directive.

    Intrinsically safe devices, i.e. devices that are incapable of igniting gasses, dust or vapors, are very important in certain deployments, and so this new ecom tablet will certainly attract attention.

    The second reason why this new ecom Tab-Ex tablet is relevant is that it's another example of a stock consumer Samsung tablet inside a specially developed case. In May 2014 we took a detailed look at the N4 device from Two Technologies, which is a Samsung Galaxy Note II with a value-added rugged case that also included a second battery and a physical keyboard (see our review here). But whereas the Galaxy Note II is a 5.5-inch "phablet," the ecom tablet is based on the Samsung Galaxy Tab Active with a much larger 8-inch screen. And the Tab Active offers pretty impressive ruggedness specs even without any third party enhancements: It's IP67-sealed, it can handle four-foot drops, and its 400-nits backlight is bright enough to handle outdoor viewing. The Tab Active is a US$700 tablet and you can see its full specs here.

    ecom isn't hiding the fact that their Tab-Ex is based on a Samsung tablet. Even the press release openly states that this type of solution "provides compatibility and a wide range of preloaded applications for a safer work environment, including unparalleled security functions like device encryption, MDM, VPN and secure connectivity (Samsung KNOX)." And, perhaps even more importantly, that "being able to offer the same internal tablet, Samsung Galaxy Tab Active, ecom provides a key benefit of consistency in product use - whether Rugged, Zone 2 / Div. 2, Div. 1 or Zone 1. And, the software applications you develop for the Samsung Galaxy Tab Active will work unchanged on the Tab-Ex01."

    All of this makes the ecom Tab-Ex another interesting case in the current discussion on where the rugged computing industry should move to best take advantage of the worldwide popularity, and the impressive technology, of consumer smartphones and tablets. As is, there are vehemently different opinions in the industry. Some feel that it makes perfect sense to pack readily available consumer technology into a value-added case, whereas others feel that the guts of a rugged device have to be just as rugged, and that the rugged market is inherently incompatible with the 6-month product/technology cycle of the consumer market.

    Below you can see the ecom Tab-Ex on the left, and the donor Samsung Tab Active on the right.

    And here are the ecom Tab-Ex product page and the Tab-Ex brochure.

    Posted by conradb212 at 3:20 PM

    September 29, 2014

    GoPro 4 -- the GoPro phenomenon, how it came about, and why it matters

    On September 29, 2014, GoPro announced the GoPro Hero 4 and also a new basic GoPro camera. This is relevant for anyone dealing with ruggedized computing gear and using technology in the field for a number of reasons. But first, what exactly is GoPro and why do their products matter?

    GoPro makes tiny little action cameras that, somehow, captured the public's imagination and are now found seemingly everywhere. You could almost say they've become a cultural phenomenon. GoPros are mounted on skydiver and motorcycle helmets, on race cars, on skateboards, on boats, on planes and drones, and seemingly on everything else, sometimes several of them. Now you can even mount them on dogs.

    Why did this happen? GoPro cameras aren't particularly attractive — just featureless little boxes — and for several years they weren't very well known either. Initial markets were pretty much limited to what founder Nick Woodman had made the little camera for, surfers. Lore has it that since every surfer wants to be noticed and "go pro," that's where the name GoPro came from. Looks didn't matter as long as the camera could be mounted on a surfer or on a surf board, and as long as it could capture their exploits. Initial GoPros weren't particularly exceptional. Even by 2005, when increasingly capable digital cameras had long been available, the first GoPro was just a cheap Chinese-made plastic film camera. Digital video arrived in the GoPro line in 2006, in a little camera that looked remarkably like what GoPros still look like today.

    That, of course, was the era of the Flip camera with its USB port. Flip was wildly successful for a couple of years, but was then, for inexplicable reasons, bought by Cisco of all companies, which did not seem to have any interest in it and simply shut Flip down, apparently seeing no market for small inexpensive vidcams.

    But two things were happening that made all the difference for fledgling GoPro. The first, of course, was the demise of Flip, which left lots of people wanting for a small handy action cam. The second was the technology convergence of immensely powerful new compression technology and inexpensive small storage media. Between those two technologies, it was suddenly possible to record glorious high definition video without the need for a big, bulky tape drive.

    It's probably fair to say that without a company by the name of Ambarella, the GoPro revolution might never have taken place. That's because Ambarella provides the chips and intellectual property to process and compress high definition video so that hours of it can be recorded on inexpensive little SD and microSD storage cards.

    So with a market pining for affordable, yet high-performance action cameras and Nick Woodman having the foresight to pack it all into his mighty little video cubes, the GoPro phenomenon was born. And that despite not even having been the first: RuggedPCReview editors reviewed the Liquid Image VideoMask and the Contour HD 1080p camcorder before we, not being surfers, even knew of the early GoPros.

    Once we did get a hold of a GoPro Hero we published a detailed review entitled "GoPro Hero — the GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do" where we analyzed both the GoPro hardware and its performance in great detail. We were quite impressed with the "terrific HD video" and "stunning value" of the GoPro, but took major issue with GoPro's underwater housing that GoPro claimed was good to depths of 180 feet. The housing probably was, but its curved dome lens kept the camera from focusing underwater. Baffled, we wrote in caps "YOU CANNOT USE THE CAMERA UNDERWATER BECAUSE IT WON'T FOCUS?!!?" Baffled was too mild a word, as we had used GoPros to record majestic giant Mantas hundreds of miles off the coast of Mexico on a once-in-a-lifetime trip, and the GoPro video was all blurry.

    We reported in even greater detail on the GoPro Hero2, which still couldn't focus underwater. Apparently, the GoPro people were great surfers, but definitely not divers. The GoPro documentation now said "Please note that due to the curved lens of the HD HERO2 (and the original HD HERO) waterproof housing you will notice a slight loss of sharpness with underwater images." To which we responded in our report: "That is not good enough. It is NOT a slight loss. It makes the camera unusable underwater."

    Our reviews, using third-party underwater housings that DID allow sharp focus, attracted GoPro's attention, and we subsequently helped them with testing an underwater housing that worked and also underwater filters that addressed the early GoPros' inability to adjust white balance to underwater conditions. Below is a brief video of the fun I had with a GoPro Hero2 and a pride of young sea lions off Coronado island in Mexico.

    The GoPro Hero3 premiered in October 2012 with an even better and less bulky underwater housing. The new line, whose body was even more compact than that of the Hero 1 and 2 and used tiny microSD cards, included a basic "White" model that performed at the Hero 1 level, a "Silver" Edition that was like an updated Hero2, and then there was the top-of-the-line "Black" Edition with a powerful Ambarella A7 chip that allowed recording 1080p video at 60 frames per second and, for the first time ever pretty much anywhere, 4K video, albeit only at 15 frames per second. The "Black" Edition could also shoot stills and video simultaneously, and it had selectable white balance settings. We tested these new cameras both above and underwater and also shot many hours of underwater 3D video with GoPro's ingenious system of linking two cameras together.

    About a year later came the Hero3+, a slightly smaller, fine-tuned version of the Hero3 that efficiently addressed a number of Hero3 omissions and shortcomings, such as better audio, close-ups, better low-light performance, etc.

    And now, GoPro has announced the Hero4, which includes the top-of-the-line Hero4 Black (US$499) and the Hero4 Silver (US$399), as well as a new basic camera, just called Hero (US$129). The most important feature of the new models is that the Hero4 Black can do 4k video at a full 30 frames per second (and also 2.7k/50fps and 1080p/120). This means the camera is now capable of shooting usable 4k video and also high quality slow motion HD video. That could be a major boon to buyers of 4k HDTVs who find out that there really isn't any 4k content. The Hero4 Silver has roughly the same capabilities as the old Hero3 Black but includes, a first for GoPros, an integrated LCD — prior GoPros could accommodate an optional snap-on LCD, but never came with an integrated one. The new US$129 entry-level Hero can do 1080p/30fps and 720p/60fps and is built directly into a waterproof housing.

    What does that all mean? Overall, with the Hero4 GoPro addresses the criticism that its Hero3/3+ could only do 4k video at a technology demonstration level, as 15 frames per second is essentially useless. The new "Black" can do real 4k video, and the high res slow motion modes are very useful. The Hero4 will likely not only help GoPro to remain wildly successful, but also cement its reputation of being at the bleeding edge of technology. The new "Silver" camera takes roughly the place of the older 3/3+ Black, and with the new basic US$129 Hero, the company has an inexpensive, invulnerable and totally waterproof camera.

    How does it all work? We don't know yet. We've had ample experience with numerous Hero 1, 2 and 3 models, but haven't been able to get any response from GoPro as of late. GoPro is now a public company whose stock premiered at US$24 a share on June 25, and had reached US$90 a share just three months later, on the morning of the Hero4 release.

    What has made GoPro such a success? Like all phenomena, it's a number of things that all came together at the right time. GoPros aren't pretty, they are difficult to use, and their battery life is marginal, but they get the job done and then some. No one else thought of offering all the numerous (and inexpensive) mounting hardware that allows GoPros to go along for the ride virtually anywhere. No one else pushed recording of extreme sports as much as GoPro, all with their signature ultra-wide angle. No other camera fueled the imagination of more videographers, who used GoPros everywhere to show new angles and record what could not be recorded before. No one saw the potential of Ambarella's ground-breaking compression technology and inexpensive tiny storage cards as clearly as GoPro. And so here we are.

    What does it mean for users of rugged computing gear? It means that stunningly powerful high-speed, high-definition recording capabilities are available inexpensively and in a form uniquely suited for field operations. In their cases that come standard with every GoPro, the little cameras are virtually indestructible and they can be used and mounted anywhere. Their recording performance is far above that of any camera integrated into any rugged handheld or tablet, and also above that of virtually all smartphones (and without the fear of breaking an expensive smartphone).

    I think it's fair to say that virtually any job out there that benefits from using a rugged notebook, tablet or handheld would also benefit from bringing along a GoPro, or a few of them.

    See GoPro's Hero4 press release

    Posted by conradb212 at 6:14 PM

    September 16, 2014

    The unpredictable nature of screen sizes

    It's a mad, mad, mad world as far as the screen size of mobile devices goes. Witness...

    For smartphones, 4.7 inches or so now seems the least customers will accept, and 5.5 inches or larger is better. When Apple introduced its iPhone 6 (4.7 inch) and iPhone 6+ (5.5 inch), the demand was such that both Apple's and AT&T's websites couldn't keep up. The AT&T website, in particular, was so messed up from all the pre-orders of giant phones that almost a week after I tried to place my own order, I still have no clue whether the order went through or not.

    Dial back a couple of decades to the dawn of handhelds. The first Apple Newtons and Tandy/Casio Zoomers and such all had displays in the 5-inch range, and all were considered much too big and heavy. Which, of course, they were. So the Palms and Pocket PCs that followed weighed less and had much smaller screens. The standard screen size for an early Windows CE-powered Pocket PC, for example, was 3.8 inches, and that was quickly replaced by the 3.5-inch format. Phones weren't smart at that time, and when they did start getting some smarts, screens got progressively smaller as the quest was for smaller and smaller phones.

    This drive for the tiniest phones possible had an impact on industrial handhelds, with many switching to 3.5, 3.2 and even 2.8-inch screens, much too small for most serious work.

    What happened when the iPhone and then Android phones came along was that phones stopped being just phones and became computers with very small screens. The screen size issue was first addressed with "apps," software specifically designed for tiny screens, and then, increasingly, with larger screens as more and more customers wanted to do "real" computer work on phones. The lure and tangible benefits of larger screens outweighed the inconvenience of having larger and larger phones, and so now we have phones with screens measuring almost six inches diagonally. Obviously, it's not a trend that can go on.

    Interestingly, tablets and laptops followed different trajectories.

    Laptops initially had very small screens, because the industry didn't know yet how to make larger LCDs. Once that technological hurdle was solved, screens became ever larger, with laptop screens growing to 17 inches. That meant size and bulk and weight and cost. Many attempts at smallish, lightweight "boutique" laptops failed due to cost, until netbooks arrived. Despite their tiny screens they sold in the tens of millions, primarily based on their very low cost. The low cost, unfortunately, also meant low performance, and so customers demanded more speed and larger screens. The industry complied, but once netbooks were large and powerful enough for real work, they cost and weighed more and customers abruptly stopped buying and abandoned the netbook market in favor of tablets or, more specifically, the iPad.

    Interestingly, despite the decreasing costs of large screens, laptops all of a sudden became smaller again. Apple dropped its 17-inch models and is devoting all its energy to super light 11 to 13 inch models. And it's rare to see a ruggedized laptop with a screen larger than 15 inches, with most being in the 12 to 14 inch range.

    With tablets, customers don't seem to know what they want. Historically, the first attempts at tablets and pen computers back in the early 90s all had 8 to 10 inch screens, primarily because there weren't any larger displays available. When Microsoft reinvented the tablet with its Tablet PC initiative in 2001/2002, almost all available products were 10 inches, with the 12-inch Toshiba Portege 3500 being the sole deviation, making it look huge. None of the Tablet PC era tablets and convertibles were successful in the consumer markets, though, and that lack of interest didn't change until the iPad arrived.

    The iPad set the standard for a "full-size" tablet with its 9.7-inch display that seemed neither too large nor too small, though many didn't know what to make of Apple sticking with the traditional 4:3 aspect ratio at a time when every display was going "wide." The aspect ratio issue hasn't been resolved as of yet, and screen sizes remain an issue. Initially unable to take any market share from Apple, competitors tried another tack by going smaller and cheaper, and suddenly everyone predicted 7-inch tablets were the sweet spot. And the 7-inch class was successful enough to get Apple to issue the iPad mini which, however, never sold nearly as well as the larger original.

    With tablets getting smaller and smartphones larger, the industry came up with the awkward "phablet" moniker, with "phablets" being devices larger than phones but smaller than conventional tablets. But now that phone displays are 5.5 inches and larger, "phablets" have become "phones," and the survival of 7-inch tablets seems in doubt, unless people suddenly decide that a 7-inch "phone" is desirable, in which case the old 1992 EO440 would look prescient rather than absurd.

    As is, no one knows what'll come next in terms of screen size. Even years after the first iPad, tablets with screens larger than 10 inches have not caught on. Phones as we know them simply cannot get much larger or else they won't fit into any pocket. And the size of traditional notebooks will always be linked to the size of a traditional keyboard, the operating system used, battery life, and the cost of it all.

    It should be interesting to see how things develop. And that's not even getting into display resolutions, where some phones now exceed the pixel count of giant HD TVs.

    Posted by conradb212 at 4:57 PM

    August 22, 2014

    Why OneNote for Android with handwriting is important

    A few days ago, the Office 365 and OneNote blogs at Microsoft announced either OneNote for Android or the addition of handwriting and inking support to OneNote for Android; it wasn't quite clear which (see here). While Microsoft OneNote isn't nearly as popular as Word and Excel, it's available as part of Microsoft Office, and supposedly over a billion people use Office. So there may be tens or even hundreds of millions who use OneNote.

    What is OneNote? It's sort of a free form doodle pad that can accommodate all sorts of data, from screen clippings to text, to audio, annotations, revisions, doodles, and so on. It can then all be sorted and shared with others. If that sounds a lot like the older Microsoft Journal, there are plenty of similarities.

    Journal goes way, way back to the mid-1990s, when a company named aha! introduced it as the aha! InkWriter. The idea behind InkWriter was that you could arrange and edit handwritten text just like you can manipulate and edit text in a word processor, hence the name: InkWriter, an ink processor. Handwriting and handwriting recognition were big back then, the central concept around which the initial pen computers of the early 1990s were built. As a result, InkWriter could also convert handwriting into text for later editing and polishing. To see how InkWriter worked, see the full manual here and Pen Computing Magazine's 1995 review here.

    Anyway, Microsoft thought enough of it to buy InkWriter and make it available in Windows CE and Pocket PCs for many years. Renamed Journal, it's still available in most PCs. In many ways, OneNote looks like an enterprise and business version of Journal, one that's more practical and more powerful. To this day, some users prefer Journal while others prefer OneNote.

    The question may come up why Journal didn't become a major hit once tablets really caught on. Aha! designed InkWriter specifically for pen computers, i.e. tablets. There was, however, one major difference between the tablets of the 1990s and the modern media tablet: all 90's era pen computers had an active digitizer pen well suited for handwriting and drawing. Though no longer centered around handwriting recognition, Microsoft's Tablet PC initiative of 2001/2002 continued reliance on an active pen, enabling super-smooth inking and beautiful calligraphy.

    That still wasn't enough to make the Tablet PC a success, and it took several more years until the iPad showed that effortless tapping and panning and pinching on a capacitive multi-touch screen was the way to go. The tablet floodgates opened, and the rest is history. Unfortunately, with all the success of modern-era touch screen tablets, one simply cannot annotate a document with fingers, one can't write with fingers, and there's a good reason why fingerpainting is for kindergartners and not for serious artists.

    Samsung realized this when it equipped its rather successful Galaxy Note phablets with miniature Wacom-style pens. But since the Wacom active digitizer adds size and cost, it isn't common in modern tablets, and even Windows-based tablets usually resort to those broad-tipped and rather imprecise capacitive pens that are, for the most part, nearly useless.

    But all that may be about to change. The latest advances in capacitive touch enable passive pens with tips as narrow as that of a pencil (see Advances in capacitive touch and passive capacitive pens). That will be a boon to Windows tablets that still have all those tiny check boxes, scrollers, and other interface elements designed decades ago for use with a mouse. And it will be a boon to apps like OneNote (and Journal!).

    And with Android now dominating the tablet market as well, the availability of OneNote for Android with full handwriting/inking capabilities may mean that OneNote will find a much larger user base than it's ever had. That's because all of a sudden, the combination of OneNote and advanced passive capacitive pens will add a whole new dimension to tablets, making them infinitely more suitable for content creation rather than just consumption.

    Posted by conradb212 at 10:13 PM

    July 31, 2014

    Android on the desktop!

    Though Android dominates the smartphone market and has a very strong position in tablets, until now Google's OS platform was not available for desktops and conventional notebooks (the Chromebook with its limited offline functionality doesn't really count).

    That has now changed with the new HP SlateBook, a full-function, quad-core Tegra 4-powered notebook with a 14-inch 1920 x 1080 pixel 10-point multi-touch screen, and running Android 4.3. The Slatebook weighs 3.7 pounds, offers up to 9 hours of battery life from its 32 Watt-Hour battery, has USB 3.0 and HDMI ports, 16GB of eMMC storage, a full-size keyboard, and starts at US$429.

    The HP SlateBook is not a rugged product, but its arrival will almost certainly have an impact on the rugged computing industry. In fact, I believe that Android availability on desktops and laptops may change everything.

    Think about it: While Microsoft continues to dominate the desktop and laptop, Redmond's presence in tablets is mostly limited to industrial and enterprise products. And while Windows CE and Windows Mobile/Embedded Handheld have managed to hang on in industrial handhelds for years longer than anyone expected, the handwriting is on the wall there, too. Between there being no upgrade or migration path between WinCE/WinMobile and Windows Phone, and Windows Phone itself being just a distant also-ran in smartphones, we'd be very surprised to see any sort of dominant Microsoft presence on industrial handhelds in the future.

    One of the primary reasons why WinCE/WinMobile managed to hang around for so long was the leverage argument: enterprise runs Microsoft and enterprise IT knows Microsoft, so using Microsoft on handhelds means easier integration and less development and training costs. But with all of those hundreds of millions of iOS and Android tablets and smartphones, even Microsoft-based enterprise quickly learned to work with them and integrate them, so the leverage argument isn't what it used to be.

    On the desktop side, it will forever be unclear to me what drove Microsoft to craft Windows 8 as a dual-personality system with the largely useless (for now anyway) metro interface and a crippled legacy desktop that made no one happy. The 8.1 upgrade fixed a few of the worst ideas, but even as a face-saving retreat it was too little to address the basic structural problems. And whatever surfaces as Windows 9 will still have the unenviable task of seeking to perpetuate Microsoft's dominance of the desktop where most people still work, while watching the world go ever more mobile, on non-Microsoft platforms.

    I think that by far Microsoft's strongest argument is that a mouse-operated windowed multi-tasking environment on a large display remains superior for creative and office work to finger-tapping on a tablet. I think the Office argument isn't nearly as strong. Sure, Office has a virtual monopoly in the classic office apps, but perhaps 90% of its functionality is available from numerous other sources. Even as a journalist, writer and publisher I rarely use more than the barest minimum of Office's myriad of functions.

    And though at RuggedPCReview.com we're OS-agnostic, by far our platform of choice is the Mac. The painless OS upgrades alone would be enough to pick Apple, but what really clinches the deal is that when you get a brand-new Mac, you simply hook the Time Machine backup from your old one up to it and transfer everything to the new one, apps, data, settings and all. The next morning, whatever was on the old Mac is on the new one. No fuss at all. That is simply not possible with Microsoft.

    Anyway, I often thought how nice it'd be to load Android on some of the numerous old Windows machines in the office that we can no longer use because the Microsoft OS on them is no longer supported or -- worse -- the system decided it was now pirated because we put in a new disk or processor. I thought how nice it'd be to have Android on the desktop or a good notebook for those times when you simply do need the pin-point precision of a mouse, the solidity of a real desktop keyboard, the comfort of a real desk and office chair, or the ability to see everything on a big 27-inch screen.

    Or how nice it'd be to have exactly the same OS on such a big, comfortable, productive machine as well as on a handy tablet or a smartphone. I'd use that in a heartbeat.

    There are, of course, questions. Microsoft itself has wrestled for decades with ways to provide a unified Windows experience on various platforms, and almost all those attempts failed. Apple didn't fall into the one-OS-for-all-devices trap and instead chose to optimize the user interface to the platform's physical characteristics. But that came at the cost of a rather weak connection between the Mac OS and iOS (why Apple doesn't get more criticism for iTunes and Cloud has always been beyond me). And even if Android were to emerge, full-blown, on laptops and desktops, we'd soon miss having multiple windows open. I mean, flipping between full-screen apps might have been acceptable back in the late 1980s, but going back to that would be a giant leap into the past.

    Still, if full Android were available for desktops today, I'd drop everything I was doing to install it on several of the systems in our office this instant. And I am pretty certain that full Android availability on desktops and laptops would mean a massive renaissance for those currently beleaguered platforms.

    So the release of HP's Android SlateBook just may be a milestone event.

    Posted by conradb212 at 8:00 PM

    July 25, 2014

    Advances in capacitive touch and passive capacitive pens

    RuggedPCReview.com just finished reviewing the Panasonic Toughpad FZ-M1, and we ended up calling it a "milestone product" because of its novel and unique passive capacitive stylus, which can greatly increase productivity by allowing far greater precision when using the Windows desktop than touch alone or earlier capacitive pens.

    This article describes the past and current situation of touch in rugged tablet computers, and why the technology employed by Panasonic in the FZ-M1 is so relevant.

    Until the iPhone and then iPad popularized capacitive touch, tablets either used active or passive digitizers. Active digitizers used a special pen, with Wacom's electromagnetic system by far the most popular because its slender pen did not need a battery. Passive digitizers were mostly of the resistive variety, and worked with any stylus, piece of plastic or even a finger nail. Neither of these older digitizer technologies helped tablets make much of an impact beyond limited industrial and special market applications.

    Active pens allowed for fairly precise operation, smooth inking, and since the cursor followed the tip of the pen even without the pen touching the display, users always knew exactly where the cursor was, just like with a mouse. But tablets relying on active pens became useless if the pen was misplaced or lost, necessitating the addition of a touch digitizer, which added cost and the need for "palm rejection," i.e. keeping the system from mistaking the pressure of one's palm when writing with the pen for input.

    Passive resistive digitizers are still widely used, but they are a poor match for any user interface designed for operation with a mouse. That's because with them, the touch of a finger or the tip of a stylus is the equivalent of a left mouse click whether or not the point of contact was where the user intended. Resistive touch works well for specially designed applications with large buttons, but relying on it for precise operation, like using a word processor or spreadsheet, can be quite frustrating.

    Apple didn't invent capacitive touch, but the iPhone certainly made a compelling case for effortlessly navigating an operating platform specially designed for the technology, and the iPad really drove the point home a few years later. Tapping, panning, pinching and zooming quickly becomes second nature. To the extent where NOT being able to pan and zoom on a display has become almost a deal-breaker.

    While capacitive multi-touch certainly redefined the way tablets are used, and is largely responsible for the massive success of the tablet form factor, the technology isn't perfect, or at least not yet. It works very well in operating environments designed for it, such as iOS and Android, and with any app where tapping, panning, pinching and zooming gets the job done. But capacitive touch works far less well with software not originally designed for it, software that relies on the precision, small-fractions-of-an-inch movements of the mouse interface. That's why it remains clumsy to use word processors or spreadsheets or graphics apps with capacitive touch -- the precision required to operate them just isn't a good match for something as blunt as the tip of a finger. It's conceivable that a generation growing up with touch, kids who may have never used a mouse, won't see this as a shortcoming, but simply as the way touch works, and there'll undoubtedly be software that'll get the job done.

    As is, we have a situation where hundreds of millions still rely on software that was NOT designed for touch (Windows), and so the need to continue to use that software conflicts with the desire to use state-of-the-art tablet hardware that uses capacitive multi-touch. So why not use a combination of capacitive multi-touch and an active pen, combining the best of both worlds? That can be done, and is being done. Samsung has sold tens of millions of Galaxy Note "phablets" that combine capacitive multi-touch with a Wacom active digitizer pen (making it by far the most successful pen computer ever sold). Is it a perfect solution? Not totally. The Wacom digitizer adds cost, what you see on the screen tends to lag behind when the pen moves quickly, and things get imprecise around the perimeter. And if the pen gets lost, pen functionality of the device goes with it.

    Whatever the technical issues may be, we've now reached a point where customers pretty much expect capacitive multi-touch even for industrial and vertical market tablets. The tap / pan / pinch / zoom functionality of consumer tablets has just become too pervasive to ignore. So we've been seeing more and more rugged and semi-rugged tablets (as well as handhelds) using capacitive touch. That's no problem with Android-based tablets, since Android was designed for capacitive touch. But unlike in the consumer market, where iOS and Android dominate, enterprise customers continue to demand Windows on their tablets. Which means a precise pen or stylus is pretty much mandatory.

    Now what about capacitive pens? They have been around since the early days of the iPad, using a broad rubber tip on a stylus to provide operation a bit more precise than is possible with a finger. How much more precise? That depends. Even slender index finger tips measure more than 10 mm, whereas those capacitive styli have tips in the 7-8 mm range. That seemed like enough of an improvement for several manufacturers of rugged tablets to include capacitive styli with their products. The tips of current styli are narrower than those of earlier versions, but still in the 5 mm range, and still soft and yielding. They work a bit better than the older ones, but nowhere near as well as a mouse or an active pen. Not much can be done about that, or can it?

    It can.

    When Panasonic sent us a Toughpad FZ-M1 rugged 7-inch Windows tablet for review it came with a full-size plastic pen. I assumed it was a passive stylus and that the little Panasonic tablet had a resistive touch interface in addition to its capacitive multi-touch. But Panasonic clearly described the stylus as "capacitive." In use, the pen with its hard 1.5mm tip was quite precise, far more so than any capacitive stylus I've ever used. It only required a very light touch to operate the classic Windows user interface. Panning was easily possible, and unlike resistive styli that really can't be used for fine artwork or handwriting (they are too jerky), this stylus produced smooth lines. It also never fell behind, one of my pet peeves with Wacom pens. The picture above shows the Panasonic pen (left) sitting next to a couple of older capacitive styli.

    So what was it? I contacted Geoff Walker, RuggedPCReview's former technology editor and now Senior Touch Technologist at Intel, and asked his opinion. Geoff got back to me with a detailed email and said that the Toughpad FZ-M1 apparently was an example of the very latest in projected-capacitive touch controller capability. He explained that over the past couple of years the signal-to-noise ratio of touch controllers has been increased to a point where they can now recognize conductive objects no more than a millimeter thick. Whereas the controllers used in the original iPad may have signal-to-noise ratios of 10:1 or 20:1, the latest controllers can handle 3000:1. He suggested we try other objects on the Toughpad, and I found that a standard plastic stylus doesn't work, but a regular pencil does, as does a metallic stylus or even a metal paper clip. There's no pressure-sensitivity, which is something Wacom pens can do, but that's not really necessary for anything but art.

    But there's more. The Toughpad FZ-M1 also comes with a touch screen mode setting utility.

    Default mode is Pen / Touch, where both fingers and the pen are recognized.

    There's pen-only operation, like when drawing or writing, or whenever you don't want your fingers to inadvertently trigger an action. Geoff said that probably works by measuring the area of touch and ignoring signals above a certain size.

    Next is a glove touch setting. Geoff said this could be done by increasing the sensitivity of the touch controller so that it can recognize a finger even a brief distance away from the screen, as in the distance that the material of a glove adds to the finger's distance from the screen. That appears to be the case as the Toughpad FZ-M1 does work while wearing thin gloves. I thought it was the conductivity of the glove material, but apparently it's the thickness of the glove that matters.

    Then there is a Touch (Water) setting. Capacitive touch and water generally don't get along because, as Geoff explained, water is so conductive that it affects the capacitance between two electrodes, the concept upon which projected capacitive touch is built. What can be done, Geoff said, is switch from the standard mutual capacitance mode to self-capacitance where the capacitance between one electrode and the ground is measured. The capacitive pen doesn't work in this mode because a fairly large touch area is required. I tried this mode with water from a faucet running onto the display. The display still recognized touch and to a very limited extent two-finger operation, but this mode is mostly limited to tapping.

    What does all this mean? For one thing, it means that thick-tipped capacitive styli will soon be a thing of the past, replaced by far more precise passive capacitive pens with tips in the one-millimeter range, like the one available with the Panasonic Toughpad FZ-M1. Second, though I don't know whether it's technically possible in Windows, since the touch controller can sense the tip even in close proximity to the screen, "hovering" or "cursor tracking," where a cursor follows the pen even without touching the screen (and thus without triggering a left mouse click), may also be implemented. That's one useful thing active styli do that even advanced passive capacitive pens can't do (yet).

    Would all of this still matter if it were not for the need to better support the legacy Windows desktop and its mouse-centric applications? It would. Tablets, hugely popular though they are, are mostly used for content consumption. The emergence of precise pens will open tablets to a much broader range of content creation. -- Conrad H. Blickenstorfer, July 2014

    Posted by conradb212 at 4:14 PM

    June 30, 2014

    Reporting from the road -- sort of

    As editors of RuggedPCReview.com, we're probably better equipped than most to report anytime, anywhere, and under any conditions. After all, we not only have access to the latest and greatest mobile computing and communications gear, but much of that gear is designed to go just about anywhere.

    The reality is a bit different, as we learned the hard way on a stretch of several weeks on the road and high sea. Testing underwater still and video camera equipment around some of the West Indies islands in the Caribbean, we wished for two things. One was that cell service were truly global, without all the hassles of having to switch SIM cards, sign up for international service, and fear the ever-present phone company gotchas that can result in huge charges on the next monthly bill. Another was that the GPS apps on our various devices were not so completely dependent on cell coverage to work. They all have many gigs of storage, so why not include a reliable and reasonably precise base map that always works, no matter where you are on the globe?

    So while on the good ship Caribbean Explorer II, we were essentially cut off from the world. There are WiFi hotspots in ports, of course, but most are locked and the signal of those that are not was usually insufficient for anything other than frustration and perhaps a Facebook update or two.

    A subsequent extended road trip to the great state of Tennessee promised better coverage. Between a MacBook Pro, two iPads, and two iPhones, keeping in touch and getting work done seemed doable. To some extent anyway. It's amazing what can be done workwise on a tablet these days but, truth be told, it can take a lot longer to get even simple things done. That's where the MacBook was supposed to come in. The big 15-inch Apple laptop doesn't have WWAN, though, and so we signed up for AT&T's 5GB US$50 hotspot service on one of the iPads.

    Why pay for the iPad hotspot when there's WiFi coverage virtually everywhere these days? Because that WiFi coverage often isn't there when you need it most, or it doesn't work right, or sitting in a busy Starbucks just isn't the best place to get work done. The hotspot worked fine until, after just three days on the road, a message came up on the iPad saying we had used up the allotted 5GB and it was time to pay for more. What? Apparently three days of moderate browsing and some work was enough to go through a full 5GB. I have no idea how that bit of activity could use so much data, but it did. That bodes ill for a future where increasingly everything is metered. Data usage is never going to go down, and metered data looks to be an incredible goldmine for the telcos and a massive pain for users.

    In the end, what it all boiled down to was that, yes, we're living in a connected world, but no, you're not really connected all the time and you can't ever rely on having service. And, surprisingly, digital data coverage remains a frustratingly analog experience. There's coverage, maybe, somehow, but it's marginal and you may or may not get a connection, and if you do it may not be good or persistent enough to get actual work done. From that standpoint, it's a mobile world, but one that requires you to be stationary to actually make it work.

    I often tell people that I can work anywhere, anytime, as long as I have internet access. But that's only true to some extent. Tablets and smartphones can only do so much. Even a full-function laptop is not likely to include every utility, file and tool that's on the desktop in the office. "The Cloud" truly cannot be relied on. And neither can data service.

    All of that will change someday, and hopefully someday soon. As is, the connected world is a work in progress.

    Posted by conradb212 at 3:49 PM

    April 23, 2014

    Unpacking the Xplore iX104 XC6

    So we get this box from Xplore Technologies, and it's pretty heavy. And it's a bit grimy. We figured we better open it outside. This is what happened:

    Yes, Xplore sent us the brand-spanking new iX104 XC6 to make a point. Sod: it can handle grime and dirt. Sunglasses: you can use it in bright sunshine. Measuring tape: you can drop it from seven feet. Ice cube tray: it's freeze-proof. Inflatable pool ring: it can handle full immersion.

    It also has a Haswell processor under the hood. And dual 128GB solid state disks in a RAID 0 arrangement. So equipped and still wet from the hose-down, the big, tough Xplore blasted to the fastest PassMark benchmark we ever recorded. Impressive.
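    For readers unfamiliar with RAID 0: it stripes data across both SSDs so sequential transfers can use the two drives in parallel, which is a big part of that record PassMark score. Here is a minimal sketch of how the standard round-robin stripe layout maps logical blocks to drives; the 64 KB stripe size is an illustrative assumption, not the XC6's actual setting.

```python
# Minimal sketch of RAID 0 (striping) address mapping across two drives.
# The 64 KB stripe size below is an assumed, illustrative value.

STRIPE_SIZE = 64 * 1024  # bytes per stripe (assumption)
NUM_DRIVES = 2

def locate(logical_offset: int):
    """Map a logical byte offset to (drive index, offset on that drive)."""
    stripe_index = logical_offset // STRIPE_SIZE
    within_stripe = logical_offset % STRIPE_SIZE
    drive = stripe_index % NUM_DRIVES          # stripes alternate drives
    drive_stripe = stripe_index // NUM_DRIVES  # position on that drive
    return drive, drive_stripe * STRIPE_SIZE + within_stripe

# Consecutive stripes land on alternating SSDs, which is why large
# sequential reads can approach twice the throughput of a single drive.
```

The flip side, of course, is that RAID 0 has no redundancy: lose either drive and the whole array is gone.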

    Posted by conradb212 at 12:48 AM

    April 7, 2014

    Durabook R8300 -- ghosts of GoBooks past

    There are things in life where the outrage and pain just never seems to go away. For me that includes the infamous game 6 in the 2002 basketball playoffs where the NBA stole the championship from my Sacramento Kings, the forced demise of the Newton, a relationship issue or two, and then there is the way General Dynamics took over that intrepid small town Itronix computer company up in Spokane, Washington, just to then ruin it and shut it down.

    There. Hundreds of people lost their jobs in Spokane when the corporate bigwigs at General Dynamics moved operations into some unfilled space in one of their buildings in Florida and then shuttered the Spokane facility where some of the most dedicated people in the industry had built rugged computers since the early 1990s.

    But wait... not absolutely everything is lost. You see, like most US computer companies, Itronix had a close working relationship with a Taiwanese OEM/ODM, in this case Twinhead. While the precise details of the Itronix/Twinhead relationship are only known to insiders, it's safe to assume that there was close interaction between the Itronix designers and engineers up in Spokane and their Twinhead counterparts in Taipei. This was not a case of a US company just slapping its label on a machine designed and made in Taiwan. It was a close cooperation, and most of the machines sold by Itronix were exclusives, meaning that no one else sold them under their brand and label.

    An example was the Itronix flagship GoBook III, which was replaced by the GoBook XRW, and then, once General Dynamics had inexplicably discarded the hard-earned brand equity in the "GoBook" name after they took over, the GD8000 (shown in picture) and its tech refresh, the GD8200. That machine and Getac's fully rugged notebooks were the Panasonic Toughbooks' primary competition in the rugged notebook market. Precise sales figures are hard to come by in this industry, but by most accounts Itronix had about a 12% marketshare.

    It's been over a year since Itronix was shuttered, but what should suddenly arrive but the GammaTech Durabook R8300. It immediately seemed familiar to me, and a closer look revealed that, yes, it's an updated version of the GD8200. The name alone gives a clue as GammaTech usually names its devices with a combination of letters and numbers centering around display size, like CA10 or S15H. Itronix named their machines by series, and so it was the GD4000, GD6000, and GD8000. The GD8200 refresh may have signified its move to 2nd generation Intel Core processors, in which case the R8300 name could be a combination of R for rugged, and 8300 paying both homage to the machine's origin and the switch to 3rd generation Ivy Bridge processors.

    Be that as it may, its origin and history instantly qualify the Durabook R8300 as a serious contender in the rugged notebook market. Yes, a 4th gen Intel chip would have been nice, but keeping up with Intel's ever-changing generations isn't the highest priority in a class of machines where longevity and backward compatibility mean more than the very latest specs. As is, the R8300, having the same design and layout as the GD8000 and GD8200, will most likely work with all existing GD-Itronix docks and peripherals, and anyone seeking to replace aging 8000 Series Itronix notebooks should be thrilled.

    So at least some part of the longstanding, fruitful cooperation between Twinhead and Itronix lives on. The GD8200 was a terrific workhorse of a machine, and with the updated tech specs, the Durabook R8300 is certain to be even better.

    Posted by conradb212 at 6:41 PM

    March 19, 2014

    Getac's latest rugged convertible replaces the V8 with a turbo-4

    I love cars and often use automotive analogies to describe situations. One came to mind recently as we evaluated a particularly interesting new rugged mobile computer. So here goes:

    With gas mileage becoming ever more important in cars and trucks, the automotive industry has been pulling out all the stops to come up with more fuel-efficient vehicles. One way to boost fuel efficiency is to reduce the weight of the vehicle. Another is to use smaller turbocharged engines that burn less fuel while providing the same performance.

    That came to mind when we did a full hands-on test with the new Getac V110 convertible notebook computer. It's very significantly lighter than its two predecessor models. And it uses a processor with less than half the thermal design power than those predecessor models. Yet, it offers close to the same performance and has even longer battery life.

    Here's how this likely came about. While Getac's early convertible notebooks had used economical low-voltage processors, subsequent models became more powerful and required larger batteries. That eventually pushed their weight into the six to seven pound range, quite a bit for devices meant to be carried around and occasionally used as tablets.

    So Getac's engineers went back to the drawing board and designed a new convertible notebook, using the latest space- and weight-saving technologies to cut an ounce here and a fraction of an inch there. Most importantly, they switched to low-voltage versions of Intel's 4th generation Core processors, which include new and superior power-saving features. That allowed them to replace the massive battery of the older models with two small batteries, each no larger than an iPhone. The resulting design, the V110, weighs over two pounds less than the last V200 we tested, and a good pound and a half less than the old V100. The V110 is also much thinner.

    So as for the automotive analogy, Getac replaced a hulking V8-powered SUV with a much svelter, lighter one with a turbo-4.

    But does that work in the real world? For the most part it does. While peak processor performance of the V110 is close to that of the standard-voltage predecessor models, idle power draw is less than half. What that means is that in many routine applications, the re-engineered and much lighter V110 will get the job done on what amounts to half a tank compared to the predecessor models. There's just one area where the automotive analogy breaks down: whereas automotive turbos can suck a good amount of fuel under full load, the V110 remained quite economical even when pushed to the limits.
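    The "half a tank" arithmetic is easy to check: runtime is simply battery energy divided by average power draw. A quick sketch with hypothetical round numbers, not Getac's actual battery or power specs:

```python
def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Estimated runtime: battery energy (Wh) divided by average draw (W)."""
    return battery_wh / avg_draw_w

# Hypothetical illustration only: if an older standard-voltage model
# averaged 10 W at idle and the V110 averages 5 W, the same 50 Wh of
# battery lasts twice as long.
old_model = runtime_hours(50, 10)  # 5.0 hours
v110_like = runtime_hours(50, 5)   # 10.0 hours
```

This is also why halving idle draw matters more than peak TDP for everyday battery life: most of a workday is spent at or near idle, not at full load.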

    Posted by conradb212 at 3:55 PM

    February 25, 2014

    Samsung Galaxy S5 -- raising the bar

    On February 24, 2014, Samsung showed their new flagship smartphone, the Galaxy S5. It's relevant because it'll sell by the many millions, and it'll be the primary opponent to Apple's current 5s and whatever Apple comes up with next. But the Galaxy S5 is also relevant because the technology it includes and provides will become the new norm. It'll be what people expect in a handheld. And that affects the rugged industrial and vertical markets, because what the Galaxy S5 (and soon many other consumer smartphones) offers is the benchmark, and people won't accept considerably less on the job.

    The Galaxy S5 is essentially a little tablet with phone functionality built in. It measures 5.6 x 2.9 x 0.32 inches and weighs five ounces. That's a lot larger than those tiny phones of a few years ago when the measure of phone sophistication and progress was how small a phone could be made. The S5 is really not pocketable anymore, but because it's so thin it still weighs less than half of what some of the original PDAs weighed.

    The Galaxy S5 screen measures 5.1 inches diagonally, and offers full 1920 x 1080 pixel resolution. That's the same as the current generation of giant flatscreen TVs, making the Galaxy's display razor-sharp, and essentially making a mockery of any claim to "high definition" that a current TV may have. And the screen uses OLED technology, which means no backlight and a perfect viewing angle.
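    For those who like numbers, "razor-sharp" follows directly from the resolution and the diagonal. The pixel density is the diagonal pixel count divided by the diagonal in inches:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal length in pixels divided by diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Galaxy S5: 1920 x 1080 pixels on a 5.1-inch diagonal
ppi = pixels_per_inch(1920, 1080, 5.1)  # roughly 432 pixels per inch
```

That works out to roughly 432 ppi, several times the density of even a large 1080p flatscreen TV viewed up close.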

    There are, of course, two cameras. The front-facing one has 2mp resolution, which in theory means that video chats can go on in full 1080p. The other one has 16mp, not as absurdly extreme as the 41mp unit Nokia offers in its flagship, but much higher than Apple's, and the same you'd expect from a good dedicated camera. And it can even record 4k video. If that's at a full 30 frames per second, it's something even the vaunted GoPro Black Edition can't do, and it means that the vast majority of hyper-sharp 4k video may come from smartphones, and not from some sort of disc player (no 4k standard exists yet), not from streaming video (most internet connections still can't handle that), and not from dedicated cameras (there are hardly any for now).
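    A back-of-the-envelope calculation shows why 4k is so hard on storage and streaming. This sketch assumes the common consumer sampling format of 8-bit 4:2:0 (1.5 bytes per pixel) and computes the raw, uncompressed data rate; real recordings are heavily compressed, but the compressor still has to chew through this much data:

```python
# Raw (uncompressed) data rate of 4k video at 30 frames per second,
# assuming 8-bit 4:2:0 chroma subsampling = 1.5 bytes per pixel.
width, height, fps = 3840, 2160, 30
bytes_per_pixel = 1.5

rate_bytes_per_s = width * height * fps * bytes_per_pixel
rate_mb_per_s = rate_bytes_per_s / 1e6  # about 373 MB/s before compression
```

Roughly 373 megabytes every second before compression, which makes it obvious why neither optical discs nor typical internet connections were ready for 4k delivery.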

    Memory, no surprises there. But I should mention once again that the 16 or 32GB of storage built into smartphones such as the new Galaxy S5 more than likely contributed to the impending demise of lower-end dedicated digital cameras which, for some unfathomable reason, still only have 32MB (megabyte, not gigabyte) on board, one thousand times less than a smartphone.

    Communication? The fastest available WiFi (802.11ac) and the latest rev of Bluetooth (4.0), plus NFC, near field communication, sort of a version of RFID. Then there's a fingerprint scanner and, also new, a pulse scanner that can feed heart rate into all sorts of apps and systems. I've long believed that sensors are a field of advance and new capabilities, providing handhelds ever more information about their environment, and thus making them ever more useful.

    And it's all powered by a 2.5GHz quad-core Snapdragon 800 processor that makes older processor technology look like cast iron flathead six designs from the 40s compared to a computer-controlled modern turbo with variable valve timing.

    Setting the bar even higher, this new Galaxy phone carries IP67 sealing, which means it's totally dust-proof, and also waterproof to the extent where it can survive full immersion into water down to about three feet. IP67 has long been the holy grail of rugged computer sealing, and now Samsung offers it in a consumer smartphone that's an eighth of an inch thick.

    I think it's fair to say that the bar has been raised. Dedicated industrial and professional tools will always do this or that better, have better dedicated components such as scanners, and they can, overall, be much tougher than any eighth-inch smartphone can ever be. But this level of technology so freely available to millions is simply certain to have an impact on expectations.

    Posted by conradb212 at 3:48 PM

    February 18, 2014

    Android parlor trick

    Just a brief entry here....

    Up to Android "Jelly Bean," i.e. versions 4.1.x through versions 4.3.x, one of the cool things about Android was the (relative) ease with which one could do screen grabs. Those, of course, are essential to product reviewers. And so it was good to know that all one had to do was connect the Android device to a PC or Mac, fire up the Android SDK, click Applications > Development > USB debugging on, and grab those screens.

    That's what I wanted to do when I recently upgraded an Android tablet from "Ice Cream Sandwich" to "Jelly Bean." Unfortunately, Applications > Development > USB debugging was no longer there, and there seemed nothing else that would allow access to the debugging mode. Google to the rescue.

    Well, turns out that getting into Android debugging mode now involves a secret handshake. You go to About Tablet, then tap on Build number SEVEN TIMES. That enables the Developer options menu, where you need to click on USB Debugging. That's about as non-obvious as it gets, and probably reflects Google's efforts to keep all those hundreds of millions of Android users from hurting themselves by accidentally disabling their device.

    That probably makes sense. I still believe one of the reasons why Linux never really made it big as an OS for the masses is that its creators insisted on leaving the arcane technical underbelly more or less visible to all. As Android matures, Google can't allow that to happen. Just like "View Page Source" has now vanished from easy view on all major browsers.

    But it's a good party trick. Next time you see a techie at a cocktail party or elsewhere, test him or her by innocently asking, "How do I get into debug mode under Jelly Bean?"

    Update 2016-08-30: To enable USB debugging in Android 5.1.x Lollipop, Settings>About Device. Tap Build number seven times.

    Posted by conradb212 at 5:54 PM

    January 12, 2014

    More 4k video contemplations

    All of a sudden, everyone is talking about 4k video. Also known as Ultra-HD video, four times the resolution of the 1920 x 1080 pixel 1080p standard, 4k was everywhere at the Consumer Electronics Show in Las Vegas. Now, obviously, 4k video isn't the most important thing on rugged mobile computer manufacturers' minds, but 4k video is nonetheless a sign of changing times. And with some consumer smartphones already offering full 1080p resolution on their small screens, and consumer tablets going well beyond that, it's only a matter of time until rugged and industrial market customers, too, will demand much higher resolution in their professional computing gear. So keeping track of what's happening out there with ultra-high resolution makes sense.
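    The "four times the resolution" figure is easy to verify: Ultra-HD doubles 1080p in each direction, which quadruples the total pixel count.

```python
# Ultra-HD ("4k" in consumer usage) is 3840 x 2160: 1080p doubled in
# each direction, so the total pixel count goes up by a factor of four.
hd_pixels = 1920 * 1080         # 2,073,600 pixels
uhd_pixels = 3840 * 2160        # 8,294,400 pixels
ratio = uhd_pixels / hd_pixels  # 4.0
```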

    As I stated in an earlier essay on high resolution (Thoughts about display resolutions, December 2013), I recently purchased a 39-inch Seiki 4k flatscreen display that can be used as a monitor or as a TV. It was an impulse buy, and I justified the (remarkably reasonable) price by deciding the Seiki was a research investment that would help us here at RuggedPCReview.com learn more about how 4k video worked, what was possible, and what wasn't.

    On the surface, 4k video makes an awful lot of sense. 1080p HD video was just perfect six or seven years ago for the emerging flood of 40-50 inch flatscreens. But flatscreens have grown since then, and 65, 70 and even 80 inches are now the norm. As you can imagine, the same old 1080p video doesn't look nearly as sharp on screens with two, three, or four times the real estate, or more. So doubling the resolution in both directions makes perfect sense.

    And it's a great opportunity to infuse new life and excitement into the flatscreen TV market. Three years ago, everyone offered 3D TVs. An amazing number were sold, amazing given that there was virtually no 3D content. And amazing considering one had to wear those annoying 3D glasses. So 3D quietly became just a built-in feature in most new TVs, but it's no longer a selling point. 4k video, on the other hand, IS now a selling point. And it'll become even bigger as time moves on.

    The problem, though, is the same as it was with HD initially, and then with 3D: no content. There is no native 4k video standard for storage or players. There are no 4k players and only a very few recorders, none of which are mature at this point.

    So we did some testing to see what's possible, and what's not possible. The goal was to see whether it's actually possible to get 4k video without breaking the bank. So here's what we did, and how it worked out so far.

    What can you do today with a 4k display such as our 39-inch Seiki? Well, you can watch regular HD TV on it via your satellite or cable setup. You can hook it up to video game consoles. You can connect it to streaming video gizmos like Apple TV, Google Chromecast, or the various Roku-type devices. Or you can connect a tablet, notebook or desktop to it. Sadly, almost none of these support resolutions higher than 1080p. Which means that on a 4k display you get video that may or may not look better than on a regular 1080p display. May or may not, because some devices do a decent job at "upscaling" the lower resolution. For the most part, though, displaying 1080p content on a 4k screen is a bit like running early iOS apps in "2X" mode, where each pixel was doubled in each direction. That's not impressive.
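    That "2X" comparison can be made concrete: doubling each pixel in each direction is nearest-neighbor upscaling, which fills the larger screen but adds no detail whatsoever. A toy sketch on a tiny 2 x 2 "image":

```python
def pixel_double(image):
    """Upscale a 2D image 2x by repeating every pixel in both directions
    (nearest-neighbor upscaling, the same idea as iOS's old '2X' mode)."""
    doubled = []
    for row in image:
        wide = [px for px in row for _ in (0, 1)]  # repeat each pixel horizontally
        doubled.append(wide)
        doubled.append(list(wide))                 # repeat the whole row vertically
    return doubled

tiny = [[1, 2],
        [3, 4]]
big = pixel_double(tiny)
# big is [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every output pixel is just a copy of an input pixel, which is exactly why 1080p content on a 4k panel looks blocky rather than sharper; smarter upscalers interpolate instead of copying, but they can't invent detail either.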

    What about hooking up a notebook or desktop to the 4k screen? Well, none of the various computers around our offices supported more than 1080p. And the one Windows desktop I use most often for testing is actually a rather old HP with just a 2.2GHz Core 2 Duo processor. That's still good enough to run Windows 8.1 at a decent clip, but the video from the HP's Asus motherboard maxed out at 1680 x 1050 pixel. So it was time to look around for a video card that could actually drive 4k video.

    That, my friends, was a sobering experience for me as I realized how little I knew of current video card technology. Sure, we cover whatever Intel bakes into its latest generation of Core processors, and have a degree of familiarity with some of the discrete graphics subsystems available for various rugged notebooks. But beyond that there's an incredibly complex world of dedicated graphics chips, interface standards, different connectors, as well as an endless array of very specialized graphics features and standards they may or may not support.

    I am, I must admit, a bit of a gamer and love spending some relaxing hours playing video games on the Sony and Microsoft consoles. A particular favorite of mine is Skyrim, and so I bought a copy for the PC, to see what it would look like on the Seiki 4k screen. Well, initially I couldn't get the game to work at all on the old HP desktop as its motherboard video didn't support one feature or another. Now it was definitely time to invest in a graphics card.

    Several hours of Googling and reading up on things yielded only a vague idea of what might be a feasible solution to our video issue. You can, you see, easily pay more for a sophisticated video card than for an entire computer. And some of those cards need two expansion slots, have large fans, and require massive power supplies just to run in a system. That was out. Was it even possible to get a decent video card that would actually drive 4k video AND work in an old PC like our HP?

    The next pitfall was that on Amazon and eBay you really never know if something is the latest technology, or old stuff from a few years ago. Vendors happily peddle old stuff at the full old list price and it's all too easy to get sucked in if you are not totally up to speed. So always check the date of the oldest review.

    What eventually worked best was checking some of the tech sites for recent new video chip intros. nVidia and AMD have practically the entire market, and are locked in fierce competition. The actual cards may come from other sources, but they will use nVidia or AMD chips. A bit more research showed that AMD had recently introduced Radeon R7 chips for reasonably priced graphics cards, and those cards actually appeared to support 4k video, and to use the PCI Express x16 slot that my old desktop had. I truly did not know whether that was the same connector and standard (almost every new Intel chip uses a different socket), but it looked the same, and so I ordered a Gigabyte Radeon R7 250 card with a gigabyte of GDDR5 memory on Amazon for the amazingly affordable price of US$89, with no-cost 2-day shipping via Amazon Prime.

    The card promptly arrived. And it fit into the x16 slot in the old HP. And the HP booted right up, recognized the card, installed the AMD drivers from the supplied DVD, and did nice 1680 x 1050 video via the card's DVI port on the vintage 22-inch HP flatscreen that had come with the PC. Skyrim now ran just fine.

    So it was time to see if the Radeon card would play ball with the Seiki 4k screen. Using a standard HDMI cable, I connected the old HP to the Seiki and, bingo, 4k video came right up. 3840 x 2160 pixel. Wow. It worked.

    Windows, of course, even Windows 8.1, isn't exactly a champion in adapting to different screen resolutions, and so it took some messing around with control panels and settings to get the fonts and icons looking reasonably good. And while I have been using a 27-inch iMac for years as my main workstation, 39-inch seems weirdly large. You'd watch a TV this size from a good distance, but for PC work, you're real close and it doesn't feel quite right.

    So now that we had an actual 4k video connection to a 4k display, it was time to look for some 4k content. YouTube has some (search "4k video demos"), and so we tried that. The problem there was that running it requires substantial bandwidth, and our solution—a LAN cable connected to a power line connector to run the signals through the building wiring, and then to our AT&T broadband—apparently wasn't up to snuff. So we saw some impressive demo video, very sharp, but more often than not it stopped or bogged down to very low frame rates. So bandwidth will be an issue.
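    Some rough arithmetic shows just how big an issue bandwidth is. This is a sketch with assumed figures (24 bits per pixel, no compression), not a measurement of any actual stream:

```python
# Raw, uncompressed bitrate of a 4k video signal at 30 frames per second.
def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

uhd_30 = raw_bitrate_gbps(3840, 2160, 30)
print(f"{uhd_30:.2f} Gbps uncompressed")  # 5.97 Gbps uncompressed
```

    Real streams are compressed by roughly two orders of magnitude, but even the compressed result is more than a powerline network adapter feeding off a modest broadband link can reliably sustain.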

    We also perused pictures in full resolution. 4k is over 8 megapixel in camera speak, and so you can view 8mp pictures in full resolution. The result, while quite stunning from a distance, is actually a little disappointing from up close. The JPEG compression that is usually hardly noticeable on smaller screens is obvious, and then there's the fact that even 4k resolution on a 39-inch screen isn't all that much. It's really just in the pixel density range of an older XGA (1024 x 768) 10-inch tablet, and those have long been left behind by "retina" and other much higher resolution screens.
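    Both the megapixel claim and the pixel density comparison can be checked with a little geometry. A sketch (screen sizes are nominal diagonals):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4k really is "over 8 megapixel in camera speak"
assert 3840 * 2160 > 8_000_000

seiki_39in_4k = ppi(3840, 2160, 39)   # the 39-inch 4k Seiki
xga_10in = ppi(1024, 768, 10)         # an older 10-inch XGA tablet

print(round(seiki_39in_4k), round(xga_10in))  # 113 128
```

    At about 113 pixels per inch, the 39-inch 4k screen actually comes in slightly below the old 10-inch XGA tablet's 128 ppi, which is why it disappoints from up close.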

    Then we cranked up the Skyrim game and ran that full-screen. It looked rather spectacular, though it probably ran in 1080p mode, since true 4k would have required a good deal more video RAM and higher resolution textures. It did look good, but it also bogged down.

    After an hour of playing with the 4k video setup I began feeling a bit nauseous and had to stop. The reason, apart from not being used to the large screen from so close up, was almost certainly a serious limitation of the 39-inch Seiki—it runs 4k video at a refresh rate of only 30 Hertz. That is very low. Most modern computer displays run at 60 Hertz or more. You don't actually see flickering, but I am convinced the brain has to do some extra work to make sense of such a low refresh rate. And that can make you nauseous.

    One of the original triggers of my impulse decision to get the 4k Seiki screen was to watch 4k video from our GoPro 3 Black cameras. The GoPro's 4k, unfortunately, is really also just a technology demonstration for now, as it runs at just 15 frames per second. Normal video viewing is at 30 fps, and games and other video may run at much higher frame rates yet. So until we can view 4k video at 30 fps and more, it's just an experiment.

    So that's where we stand with 4k video. There's a vast discrepancy between the marketing rhetoric that's pushing 4k now, and the fact that there's almost no content. And significant technical barriers in terms of frame rates, bandwidth, standards, and existing hardware and software that just can't handle it. It's a bit like Nokia trying to tell people they need a 41mp camera in a phone when the phone itself can display less than a single megapixel, and it would take more than four 4k screens in a matrix to display a single picture in full resolution.
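    For what it's worth, the Nokia comparison checks out arithmetically (41 megapixels is the nominal marketing figure; exact sensor resolution varies with aspect ratio):

```python
# How many 4k screens would it take to show a 41-megapixel photo at 1:1?
sensor_px = 41_000_000       # nominal 41 MP camera sensor
screen_px = 3840 * 2160      # one 4k display: 8,294,400 pixels

screens_needed = sensor_px / screen_px
print(round(screens_needed, 1))  # 4.9
```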

    In summary, I did not go into great detail in my investigations, and there will be many out there who have much more detailed knowledge of all those standards and display technologies. But we did take a common sense look at what 4k can and cannot offer today. The following roughly describes the situation:

    - 4k displays will soon be common. Dell already offers a range of affordable models (like this one).
    - 4k video support is rapidly becoming available as well, and 4k video cards start at well under $100.
    - Some Intel Haswell chips offer integrated 4k video support.
    - There's virtually no 4k content available.
    - 4k uses more bandwidth than many current linkups can supply.
    - The 4k experience is in its infancy, with insufficient refresh rates and frame rates.
    - BUT it's clearly the future.
    - 4k rugged displays and signage systems are rapidly becoming available.
    - Do spend the time learning what it all means, and how it fits together.

    Posted by conradb212 at 1:02 AM

    December 26, 2013

    Does your Pentium have an Atom engine?

    There was a time in the very distant computing past when the sole decision you needed to make when buying a computer was whether to get the Intel 386/33 or save a few bucks and get the slightly slower 386/25. Today, if you use Intel's handy ARK app that lists every product available from Intel, there are a staggering 1,874 different processors listed. That includes processors targeted at desktop, server, mobile and embedded computers, but even if you leave out servers and desktops, there's still a choice of 949 processors for mobile and embedded applications. Not all of them are state-of-the-art, but even chips designated as "legacy" or "previous generation" are still in widespread use, and available in products still being sold.

    The mind-blowing number of Intel processors available brings up the question of how many different Intel chips the world really needs. As of the end of 2013, Apple has sold about 700 million iPhones, iPads and iPod Touch devices that made do with a mere dozen "A-Series" chips. Not too long ago, tens of millions of netbooks all used the same Intel Atom chip (the N270). So why does Intel make so many different chips? Even though many are based on the same microarchitectures, it can't be simple or cost-efficient to offer THAT wide a product lineup.

    On the customer side, this proliferation serves no real purpose. End users make almost all purchasing decisions based on price. Figure in desired screen and disk sizes, and whether there's an Atom, Celeron, Pentium, Core i3, Core i5, or Core i7 chip inside is confusing at best. For hardware manufacturers it's worse, as they must deal with very rapid product cycles, with customers both demanding legacy support AND availability of the latest Intel products. THEY must explain why this year's Intel crop is so much better than last year's, which was so much better than what Intel offered the year before. Or which of a dozen roughly identical Intel chips makes the most sense.

    As is, Intel has managed to bewilder just about anyone with their baffling proliferation of processors, and without the benefit of having established true brand identities. What Intel might have had in mind was kind of a "good, better, best" thing with their Core i3, i5 and i7 processors, where i3 was bare-bones, i5 added Turbo mode and some goodies, and i7 was top-of-the-line. But that never really worked, and the disastrous idea to then come up with a generation-based system that automatically made last year's "generation" obsolete only adds to the confusion. And let's not even get into Intel "code names."

    Atom processors were supposed to provide a less expensive alternative to the increasingly pricey Core chips—increasingly pricey at a time when the overall cost of computers became ever lower. Unfortunately, Intel took the same approach with Atom as Microsoft had taken with Windows CE—keep the line so wingclipped and unattractive that it would not threaten sales of the far more profitable Windows proper and mainline Intel chips. At RuggedPCReview we deeply feel for the many vertical and industrial market hardware and systems manufacturers who drank Intel's Atom Kool-Aid only to see those Atom processors underperform and quickly need replacement with whatever Intel cooked up next.

    But that unfortunate duality between attractively priced but mostly inadequate entry level Atom chips and the much more lucrative mainline Core chips wasn't, and isn't, all. Much to almost everyone's surprise, the Celeron and Pentium brands also continued to be used. Pentium goes back to 1993, when Intel needed a trademarkable term for its new processors, having gotten tired of everyone else also making "386" and "486" processors. "Celeron" came about a few years later, around 1998, when Intel realized it was losing the lower end market to generic x86 chipmakers. So along came Celeron, mostly wingclipped versions of Pentiums.

    Confusingly, the Celerons and Pentiums continued to hang around when Intel introduced its "Core" processors. The message then became that Pentiums and Celerons were for those who wouldn't spring for a real Core Duo or Core 2 Duo processor. Even more curiously, Pentiums and Celerons still continued when the first generation of the "modern era" Core processors arrived, and then the second, third and fourth generation. Study of spec sheets suggested that some of those Pentiums and Celerons were what one might have called Core i1 and i2 chips, solutions for when costs really needed to be contained to the max. In some cases it seemed that Intel secretly continued its long-standing tradition of simply turning off features that were really already part of those dies and chips. A weird outgrowth of that strategy were the last-ditch life support efforts of the dying netbook market to answer the calls for better performance by replacing Atom processors with Celerons that were really slightly throttled Core i3 processors. That actually worked (we have one of those final netbooks in the RuggedPCReview office, an Acer Aspire One 756, and it's a very good performer), but it was too little, too late for netbooks, especially against the incoming tide of tablets.

    Given that the choice of mass storage, the quality of drivers, keeping one's computer clean of performance-zapping gunk and, most of all, the speed of one's internet connection (what with all the back and forth with The Cloud) seems to have a far greater impact on perceived system performance than whatever Intel chip sits in the machine, it's more than curious that Celeron and Pentium are not only hanging around, but have even been given yet another lease on life, and this one more confusing than ever.

    That's because Intel's latest Atom chips, depending on what they are targeted at, may now also be called Celerons and Pentiums. It's true. "Bay Trail" chips with the new Atom Silvermont microarchitecture will be sold under the Atom, Celeron and Pentium brand names, depending on markets and chip configurations. Pontiac once took a heavy hit when the public discovered that some of their cars had Chevy engines in them. Pontiac is long gone now, and General Motors has ditched other brands as well, realizing that confusing consumers with too many choices made little sense. Even GM, however, didn't have anywhere near the dominance of their market as Intel has of its market.

    Where will it all lead? No one knows. Intel still enjoys record profits, and other than the growing competition from ARM there seems little reason to change. On the other hand, if the current product strategy continues, four years from now we may have 8th generation Core processors and 4,000 different Intel chips, which cannot possibly be feasible. And we really feel for the rugged hardware companies we cover. They are practically forced to use all those chips, even though everyone knows that some are inadequate and most will quickly be replaced.

    PS: Interestingly, I do all of my production work at RuggedPCReview.com on a 27-inch Apple iMac powered by a lowly Core 2 Duo processor. It's plenty fast enough to handle extreme workloads and extensive multitasking.

    Posted by conradb212 at 6:35 PM

    December 13, 2013

    Michael Dell's keynote at Dell World 2013: reaching for the cloud

    One big problem with being a public company is that every three months it's imperative not to disappoint analysts and investors. Dell won't have to worry about that anymore because it returned to being a private company. That means Dell can now take the longer look, pursue the bigger picture, and no longer suffer from the affliction of short-term thinking, as Michael Dell so eloquently put it in his keynote address at the 2013 Dell World conference in Austin, Texas.

    And that was the core of Michael Dell's message, that as a private company Dell now has the freedom to make the bold moves that are necessary, investing in emerging markets, and take a long-term view of their investments. Dell said he senses a new vibe of excitement at the company that bears his name, and that he feels like he is part of the world's largest startup.

    He did, of course, also mention that sales were up in the double digits, that 98% of the Fortune 500 are Dell customers, that Dell has established a new Research Division and also the Dell Venture Fund. Dell reminisced how he had been taking an IBM PC apart in his dorm room, found that many of the components weren't actually by IBM but steeply marked-up 3rd party components, and how he felt he could make that technology available cheaper and better. He expounded on a recurring theme in his keynote, that the proliferation of computer technology between roughly 1985 and 2005 also saw poverty in the world cut by half.

    Dell showed an Apple-esque commercial that will run in 2014 and pays homage to all the little companies that started with Dell, including the one in dorm room #2714 (Dell itself, founded on a thousand bucks). Nicely done and charming. He spoke of how the future is the move beyond products and to end-to-end scalable solutions. He spoke of $13 billion invested in research, highlighted the Dell PowerEdge servers that are in the top two in the world, and demonstrated, right on stage, how integrated server, storage and network functionality in fluid cache technology clocked in at over 5 million iops (input output operations per second).

    Dell spoke of their new mantra, to transform, connect, inform, and protect. Transform as in modernize, migrate and transition to the future. Connect as in connecting all the various computing devices out there, including the Dell Wyse virtual clients, because, still and for a good time to come, "the PC, for a lot of organizations, is how business is done." Inform, as in turning data into useful, productivity enhancing results, making companies run better with data analytics. And protect as in offering next gen firewalls and connected security to keep data safe and ward off attacks before they even happen.

    Dell also reminded the audience that the company has been the leader in displays for over a decade, and touched on 4k2k video resolution that is available from Dell now, another example of Dell making disruptive technology available at accessible prices.

    Dell then introduced Elon Musk, who had driven in a red Tesla Model S, his company's groundbreaking electric car. Along came David Kirkpatrick, who took on the job of engaging Musk interview style. Musk, of course, also pioneered PayPal, and, in addition to Tesla, runs SpaceX, sending rockets into space. Musk was there, however, not so much to discuss technology, but to illustrate the impact of pursuing the big picture, big ideas, things that simply need to get done. As if on cue, Dell rejoined the conversation, congratulating Musk and bemoaning the affliction of short-term thinking that hamstrings progress and big ideas and bold bets.

    Musk said this must be the best time in human history, where "information equality" breaks down barriers, making the world a safer, better place, meshing with Dell's clear belief that technology is a boon for mankind. Today, Musk said, anyone with internet access has more information available than the very President of the United States had just 30 short years ago. Musk also expressed regret over a pessimistic media that always seems to seek out the negative.

    The address concluded with an observation by Kirkpatrick about Tesla cars' constant connection with Tesla's computers, and how that continuous feedback loop is used to make the cars better. Just as, Dell said, is his company's approach with customers: a constant back and forth, and constant feedback, in their quest to improve and innovate.

    This 75 minute keynote was a bold move with a big picture and a big message, with Dell, Musk and Kirkpatrick almost like actors in a play. Quite a task to pull this off, but the message was loud and clear and finely tuned: Dell is now free to pursue big ideas. Like Elon Musk with his electric car and rockets shooting into space, Dell can now reach for the cloud(s), and beyond.

    Posted by conradb212 at 2:57 PM

    December 5, 2013

    Thoughts about display resolutions

    The resolution of computer displays is an interesting thing. There are now handhelds with the same number of pixels as large flatscreen TVs, Apple claims its "retina" displays are so sharp that the human eye can no longer see individual pixels, and the very term "high definition" is in the process of being redefined. Let's see what happened with display resolution and where things are headed, both in handhelds and in larger systems, and what 4k2k is all about.

    Color monitors more or less started with the original IBM PC's 320 x 200 pixel CGA resolution. In 1982, the monochrome Hercules video card provided 720 x 348 pixel for IBM compatibles. For a long time IBM's 640 x 480 VGA, introduced in 1987 with their PS/2 computers, was considered a high resolution standard for the desktop and for notebooks. Then came 800 x 600 pixel SVGA and 1024 x 768 pixel XGA, and those two hung around for a good decade and a half in everything from desktops to notebooks to Tablet PCs. Occasionally there were higher resolutions or displays with aspect ratios different from the 4:3 format that'd been used since the first IBM PC, but those often suffered from lack of driver and software support, and so pretty much everyone stayed with the mainstream formats.

    It was really HDTV that drove the next advance in computer displays. During a 1998 factory tour at Sharp in Japan I had my first experience with a wide-format TV. It looked rather odd to me in my hotel room, sort of too wide and not tall enough and not really a TV, but, of course, that turned out to be where things were going. In the US, it would take a few more years until the advent of HDTV brought on wide-format, and terms such as 720p and 1080p entered our tech vocabulary. For a good while, smaller and less expensive flatscreen TVs used the 1280 x 720, or "720p," format, while the larger and higher end models used full 1920 x 1080 pixel "1080p" resolution. That meant a wide 16:9 aspect ratio.

    Desktop and notebook displays quickly followed suit. The venerable 1024 x 768 XGA became 1366 x 768 WXGA and full 1920 x 1080 pixel displays became fairly common as well, albeit more on the desktop than in mobile systems. Professional desktops such as the 27-inch Apple iMac adopted 2560 x 1440 resolution. On the PC side of things standards proliferated in various aspect ratios, resulting in unwieldy standards terminology such as WSXGA+ (1680 x 1050) or WQXGA (2560 x 1600).

    An interesting thing happened. Whereas in the past, TVs and computer displays had very different ways to measure resolution, they're now more and more the same, what with flatscreen TVs really being nothing more than very large monitors. The 1920 x 1080 pixel 1080p format, in particular, is everywhere. Amazingly, that's becoming a bit of a problem.

    Why? Because as TVs become ever larger, the same old 1080p resolution no longer looks quite as high definition as it did on smaller screens. Put a 42-inch and a 70-inch TV next to each other and you can plainly see the degradation in sharpness. The situation isn't as drastic on notebooks because, after growing ever larger for many years, notebook displays have leveled off in size, and have even shrunk again (Apple no longer makes the 17-inch MacBook Pro, for example). Desktop monitors, however, keep getting larger (I use two 27-inch monitors side-by-side), and that means even "high definition" 1920 x 1080 doesn't look so good anymore, at least not for office-type work with lots of small text. While I was excited to get a reasonably priced HP Pavilion 27-inch 1080p IPS monitor for the PC sitting next to my Mac, I find it almost unusable for detail work because the resolution just isn't high enough to cleanly display small text.

    While current resolution standards are running out of steam on larger displays, the situation is quite different in the small screens used on handhelds and smartphones. There we have a somewhat baffling dichotomy where many industrial handhelds still use the same low-res 320 x 240 QVGA format that's been used since the dawn of (computer) time, whereas the latest smartphones have long since moved to 1280 x 800 and even full 1920 x 1080 resolution. Tablets, likewise, pack a lot of pixels onto the rather small 7-inch and 10-inch formats that make up the great majority of the tablet market. Apple led the way with the "retina" 2048 x 1536 pixel resolution on the 3rd generation iPad. That's like a 2x2 matrix of 1024 x 768 pixel XGA displays all in one small 9.7-inch screen. Trumping even that, the latest Kindle Fire HDX tablet packs an astounding 2560 x 1600 pixel onto its 8.9-inch screen. So for now, smartphones and tablets are at the front of the high-resolution revolution.

    Somehow, we quickly get used to higher resolution. Older displays that we remember looking great now look coarse and pixellated. With technology you can never go back. The state-of-the-art almost instantly becomes the acceptable minimum. Whereas our eyes used to expect a degree of blurriness and the ability to see individual pixels on a screen, that's less and less acceptable as time goes on. And it really does not make much sense to declare 1080p as "high definition" when by now that resolution is used on anything between a smartphone and an 80-inch TV.

    Fortunately, the next thing is on the horizon for TVs and monitors, and it's called 4k2k, which stands for roughly 4,000 x 2,000 pixel. 2160p would be a better name for this likely standard, as it is simply a 2x2 matrix of four current 1080p displays, or 3,840 x 2,160 pixel. That still only means that a giant new 80-inch screen will have no more than the pixel density of a 40-inch 1080p display, but it's certainly a logical next step.

    I had all of this on my mind when I received an email offer from one of my favorite electronics places. It was for a 39-inch Seiki TV/monitor with 4k resolution for a very attractive price and free shipping. I impulse-ordered it on the spot, telling myself that I need to know where the 4k technology stands and what, at this point, it can and cannot do. And this would finally be a monitor where I could watch the 4k video my GoPro 3 Black Edition can produce.

    So I got the Seiki and it's a great bargain. Or it would be if I actually had anything that could drive a 4k2k display in its native mode, which I don't. In fact, at this point there is virtually nothing that can drive a 4k display in full 3840 x 2160 pixel resolution. Yes, the 4k videos from my GoPro 3 Black Edition would probably look great on it, but that would require me to copy the video footage to a PC that can drive an external 4k monitor, which virtually no stock PCs can do today. DVD or Blu-Ray players certainly can't display in 4k, and even brand-new gear like the Sony PS4 game console can't. I COULD, of course, get a low-end 4k-capable video card from AMD, but I am not sure any of the PCs in the RuggedPCReview office could actually even accommodate such a card.

    The unfortunate truth is that as of late 2013, there's very little gear that can send a true 4K video signal to a 4K TV or monitor. Which means that most content will be viewed in up-sampled mode, which may or may not look great. This will undoubtedly become a marketing issue in the consumer space—there will be great interest and great expectations in 4K TVs, but just as was the case with 3D TVs a couple of years ago, there will be virtually no 4K sources and content. And that can make for a customer backlash. There is some very detailed information on Amazon (see here) that provides an idea of where things stand.

    What does all that mean for rugged mobile technology? Not all that much for now, but I am certain that the ready availability of super-high resolution on smartphones and consumer tablets will change customer expectations for rugged device displays just as capacitive touch changed touch screen expectations. Once the (technology) cat's out of the bag, that's it. It won't go back in.

    And just as I finished this entry, I see that Dell announced 24-inch and 32-inch UltraSharp monitors with 4k 3840 x 2160 resolution, and a 28-inch version will soon follow (see Dell news). Given that Dell is the leading flat-panel vendor in the US and #2 in the world, that likely means that we'll soon see a lot more systems capable of supporting 4k resolution.

    Posted by conradb212 at 4:53 PM

    November 14, 2013

    State of Outdoor-Viewable Displays Late 2013

    One of the big differentiating factors in ruggedized mobile computers is how well the display is suited for work outdoors in bright daylight and in direct sunlight. This can make the difference between a device being useful and productivity-enhancing, or frustrating and nearly useless.

    Why is this such a big issue? Aren't today's displays so good that the only thing that matters is how large a display you want, and perhaps what resolution it should have? For indoor use that's true, but outdoors it's an entirely different story.

    The outdoor viewability problem

    Overall, LCD displays have come a very long way in the last two decades. If you're old enough, you probably remember those very small notebook displays that you could barely read. And if you looked at them from an angle, the color—if you had color—shifted in weird ways. Almost all of today's LCD displays are terrific. They are bright and sharp and vibrant, and you can view them from almost any angle (and depending on the technology, from all angles). Most of today's tablet and notebook displays are so good that it's hard to imagine they could get any better.

    Until you take them outdoors, that is.

    The difference between indoors and outdoors is amazing. A screen that is bright indoors will almost wash out when you take it outdoors on a sunny day. That's because even a very strong backlight is no match for the sun. Even very good displays become almost unreadable when they are facing the sun. The contrast goes away, the screen may go dark or it may become so reflective that it can't be used anymore. Some displays also assume strange hues and casts and colors when in the sun. Others have a shimmering iridescent look that distracts the eye. And resistive touch screens have a slightly yielding top surface that can make for optical distortions that can be very distracting.

    Matte and glossy displays

    Most notebook and tablet displays these days are very glossy. That's a trend that started perhaps a decade ago in Japan where vendors hoped bright, glossy screens would be more easily noticed in crowded electronics shop windows. Another argument for glossy displays was that they make the picture "pop" with rich color and sharp contrast when watching video or movies. That probably depends on the individual viewer, but overall glossy screens work well enough indoors (where there are few reflections) that virtually all manufacturers switched to them. Outdoors where there are a lot of reflections, glossy displays can be very hard to view.

    Some tablets and notebooks have matte screens or they have anti-glare coatings. A "matte" surface can be achieved via a liquid coating with tiny particles that then diffuse light, chemical etching that makes for a rougher surface, or mechanical abrasion. The much reduced intensity of light reflection makes matte surfaces ideal for office workers. You'd think that matte displays are also the answer for good outdoor viewability, and matte displays can indeed handle reflections much better outdoors as well. The problem, though, is that matte screens, especially older ones, just diffuse the light. When that happens outdoors where light can be very strong, the screen can turn milky and becomes hard or impossible to read.

    The different display technologies

    Most of today's standard LCDs are transmissive, which means that you have a backlight behind the LCD. This approach works great indoors because the ratio between the strength of the backlight and the reflected ambient light is very large. Outdoors, the ambient light is much stronger, and so the ratio between the strength of the backlight and the amount of reflected light is much smaller, which means there is much less contrast.

    In the past, notebook manufacturers tried different approaches to make the screens readable outdoors.

    One approach was to use reflective LCDs instead of transmissive ones. This way, the brighter the sunlight, the more readable the display becomes. This never caught on for two reasons. First, since you couldn't use a backlight, you needed a sidelight to make the screen viewable indoors. That just doesn't work with displays larger than those in a PDA. Second, even outdoors, the screens looked flat because the LCD background was greenish-brown, and not white.

    Another approach was "transflective" screens. Transflective screens were part transmissive so that you could use a backlight, but also part reflective so you could see them outdoors. This was supposed to be the best of both worlds, but it was really just a compromise that didn't work very well. So early transflective display technology was abandoned.

    Today, most outdoor displays use what one might call modified standard transmissive technology. These screens perform just like standard transmissive displays indoors, while controlling reflections and preserving contrast outdoors. They do that with various tricks and technologies such as optical coatings, layer bonding, and circular polarizers to reduce reflected light. The overall goal of all these measures is to control distracting reflections and to get the best possible screen contrast. That's because for keeping a display readable outdoors and in sunlight, preserving display contrast is more important than anything else. That's where effective contrast comes into play.

    Getac's QuadraClear brochure shows the effect of linear and circular polarizers

    Almost all major manufacturers of ruggedized mobile technologies have their own special approaches, such as "QuadraClear" (Getac), "CircuLumin" (Panasonic), "MaxView" (Handheld Group), "xView Pro" (MobileDemand), "View Anywhere" (Motion Computing), "IllumiView" (Juniper Systems), "AllVue" (Xplore Technologies), and more.

    What matters is effective contrast

    There are various definitions of contrast. The most important one for outdoor displays is the "effective" contrast ratio, which doesn't deal with the annoying mirror-like reflections glossy screens are infamous for, but rather with the sum total of the light reflected by the various layers a typical LCD assembly consists of. The effective contrast ratio relates the light generated by the display's own backlight to that reflected light.

    There are, in essence, two major ways to control those internal reflections. One is adding a circular polarizer, which combines a linear polarizer and a retardation film to block reflected light. The other is bonding together layers of the LCD assembly, thus eliminating two reflecting surfaces with every bond. How well it all works depends on exactly how these elements are combined to produce the best possible effect, and that's generally a closely guarded secret.
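    Why bonding helps so much can be illustrated with the standard Fresnel equation for reflection at normal incidence. This is a generic physics sketch, not any vendor's actual formula: every air-to-glass interface reflects roughly 4% of incoming light, while a bond filled with an index-matched optical adhesive reflects almost nothing, so each eliminated air gap removes two reflecting surfaces.

```python
def fresnel_reflectance(n1, n2):
    """Fraction of light reflected where two media of refractive
    index n1 and n2 meet, at normal incidence (Fresnel equation)."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Air (n = 1.0) to glass (n = 1.5): about 4% reflected per surface
print(round(fresnel_reflectance(1.0, 1.5), 3))    # 0.04

# Bonded: index-matched adhesive (n = 1.47) to glass (n = 1.5): near zero
print(round(fresnel_reflectance(1.47, 1.5), 4))   # 0.0001
```

    The refractive indices are typical textbook values; actual display stacks use proprietary materials.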

    A rule-of-thumb formula for the effective contrast ratio of an LCD screen used outdoors is 1 + emitted light divided by reflected light. For emitted light we use the backlight, measured in nits. For reflected light we multiply moderate sunlight, which is the equivalent of about 10,000 nits, by the percentage of light reflected by the display. A normal, untreated notebook screen reflects about 2% of sunlight. A combination of optical coatings can bring that number down to about half a percent for displays without touch screens, and to about 0.9% for displays with (resistive) touch screens.

    If you plug numbers into this formula, you find that a standard notebook without any optical treatments has an effective contrast ratio of about 2:1, which means it's pretty much unreadable outdoors. If you boost the backlight or apply optical coatings, the contrast ratio goes up to about 6:1, which is the minimum the military requires in its computers. If you boost the backlight AND apply coatings, you get contrast ratios of about 6:1 or 7:1 for displays with resistive touch, and up to 11:1 or 12:1 without.
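    The arithmetic above is easy to put into a short script. A minimal sketch, using the same approximations as the text (10,000 nits for moderate sunlight, and the reflectance percentages given above):

```python
def effective_contrast(backlight_nits, reflectance, ambient_nits=10_000):
    """Rule-of-thumb effective contrast ratio: 1 + emitted / reflected."""
    reflected_nits = ambient_nits * reflectance
    return 1 + backlight_nits / reflected_nits

# Standard notebook: 200-nit backlight, 2% reflectance -> about 2:1
print(round(effective_contrast(200, 0.02), 1))    # 2.0

# Strong backlight plus coatings, no touch screen: 500 nits, 0.5% -> 11:1
print(round(effective_contrast(500, 0.005), 1))   # 11.0

# Same, but with a resistive touch screen: 500 nits, 0.9% -> about 6.6:1
print(round(effective_contrast(500, 0.009), 1))   # 6.6
```

    The function name and the specific input figures are illustrative; real-world measurements vary with sun angle and display construction.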

    The bottom line with these "modified transmissive" displays is that a number of factors affect the effective contrast ratio and thus display viewability outdoors. It all boils down to the best possible combination of optical coatings and layer manipulations to reduce internal reflection, plus a good strong backlight.

    Super-bright backlights

    If that is so, then why not just have the strongest possible backlight? That is indeed one good way to boost the viewability of an outdoor display, but it comes at the cost of either a larger, heavier battery or less battery life. There are ways to minimize the extra drain on the battery, one being a light sensor that will throttle the backlight whenever full brightness is not needed, and another the presence of an easily accessible "boost" button that lets the user turn extra brightness on or off.

    If you're wondering how backlights are measured, the most commonly used unit is the nit, a unit of luminance equal to one candela per square meter, measured perpendicular to the surface. Most consumer notebooks and tablets are in the 200-nit range, most semi-rugged and rugged devices designated as "outdoor-viewable" are in the 500 to 700 nit range, and some "sunlight-viewable" screens go all the way to 1,000-1,500 nits.

    Does a touch screen impact outdoor viewability?

    If it is resistive touch, yes, it usually does make a difference. That's because light reflects on every surface of the LCD assembly, and resistive touch adds additional surfaces, therefore lowering the effective contrast ratio. Another problem is that the soft, yielding surface of resistive touch screens results in distorted reflections that are not present in totally smooth glass surfaces.

    Capacitive touch, which is increasingly used even in rugged devices, doesn't have this problem. However, it always pays to closely examine the display under all sorts of extreme lighting conditions, as some non-resistive technologies can have distracting grid patterns that become visible outdoors.

    Hybrid approaches

    In addition to reflective, transflective and transmissive screens and the various ways to tweak them for better outdoor performance, there are some interesting hybrid approaches. One of them is Pixel Qi's enhanced transflective technology where display pixels consist of a transmissive and a reflective part that have separate drivers. In essence, that allows them to have a "triple mode" display that can use one or both technologies, depending on the lighting situation. A while ago, RuggedPCReview had the opportunity to examine the Pixel Qi technology in a detailed review (see here), and we concluded that the technology absolutely works. However, potential users need to be aware of its inherent gradual switching from full color indoors to muted color outdoors and black and white in direct sunlight as that may affect color-dependent apps.

    Most recently, Pixel Qi has come out with displays that do show colors in direct-sunlight reflective mode, but we have not had hands-on time with any of those yet. Also of note is that Pixel Qi's founder and chief technology officer left the company in early 2013 to work on the Google Glass project, and that's not good news for Pixel Qi.

    How about OLED?

    What about the OLED/AMOLED technology that is touted as the next big thing in flatscreen TVs, better than either LCD or Plasma? OLED screens have been used in some cameras, phones and even scuba diving computers for a while now, but I can't see them as a candidate for sunlight-viewable displays. That's because OLED technology is essentially a grid of special LEDs that easily washes out in sunlight or even bright daylight.

    Other important considerations

    On top of using the best possible outdoor-viewable display available for a given job, you also want good underlying LCD technology and good optics. A wide viewing angle makes the display easier to read, and we always strongly suggest starting with an IPS (In-Plane Switching, see Wiki) or, better yet, an AFFS (Advanced Fringe Field Switching, see Wiki) screen so that viewing angles and color shifts simply aren't an issue. Some anti-glare coatings can create annoying reflections that trick your brain into filling in detail, making the screen even less readable than it would be without the coating. You also don't want a display that reflects distorted images; that again confuses your brain and makes the screen harder to read. And you want a display that is as resistant to fingerprints as possible, because fingerprints can become enormously visible and distracting under certain outdoor lighting conditions.

    Examples from RuggedPCReview.com testing

    Below are examples from RuggedPCReview testing that illustrate some of the issues and properties discussed:

    Below: The first image shows pretty much how the modern era of optically treated transmissive displays got started around 2007 when Itronix introduced their DynaVue outdoor-readable display technology. The image shows how DynaVue compared to an earlier non-DynaVue Itronix notebook.

    Below: All of a sudden it was possible to control reflection to an extent where displays remained viewable in direct sunlight. An older Itronix notebook with a matte display surface, by comparison, diffuses the sunlight so much that the screen is totally blown out.

    Below: Unbeknownst to most, Dell was another pioneer with optically treated outdoor-viewable displays. In 2007, a Dell Latitude ATG D630 with its 500-nit backlight offered excellent outdoor viewability with an 11:1 effective contrast ratio. Dell did that by bonding a piece of glass onto the polarizer film, thus eliminating the polarizer film's reflectivity, and then treating the smooth exterior surface of the glass.

    Below: Comparison between a "modern-style" optically treated transmissive display on an Xplore rugged tablet, and the matte display on a competing product. Indoors both tablets looked great, but outdoors the difference was obvious.

    Below: The same Xplore iX104 tablet versus one of the original convertible Tablet PCs. The matte, non-treated 2002-era Toshiba Portege display simply vanishes outdoors.

    Below: Matte displays can work quite well outdoors; this DRS ARMOR tablet mutes the reflection and remains quite viewable, though you have to hunt for the right viewing angle.

    Below: That's a Getac rugged notebook on the left versus a Gateway consumer notebook whose glossy display looked great indoors and even had decent contrast outdoors, but whose reflections made it unusable there.

    Below: A GammaTech SA14 semi-rugged notebook compared to an Apple MacBook Pro. Indoors the MacBook excels; outdoors it falls victim to reflections.

    Below: Here we see how a rugged MobileDemand T7200 xTablet compares to a Google Nexus 7 consumer tablet. Same story: indoors the Nexus screen looks terrific, outdoors it's all reflections, although the display itself remains quite viewable.

    Below: Motion CL910 tablet vs. iPad 3—the iPad's retina display is terrific and is even viewable outdoors, but the super-glossy surface is very prone to reflections.

    Below: MobileDemand pioneered using a modified Pixel Qi display in a rugged tablet, their xTablet T7200. Note how the display works under all lighting conditions, from indoors to direct sun where the display switches to gray-scale reflective.

    Below: An example of the interesting optical properties of treated outdoor displays. The two Motion Computing tablets are both excellent outdoors, and have some of the best displays anywhere, but it's clear that they have different optical treatments.

    Below: Another example of the odd optical properties of some displays. This one is basically quite viewable, but the wavy, distorted reflections make it difficult to use.

    Below: An example of brute force—the Getac X500 rugged notebook has a superbright backlight that, combined with very good optical treatment, makes this one of the best displays available.

    So what's the answer?

    While there are a number of interesting alternative display technologies, at this point, the best overall bet is still a combination of optical treatments and coatings, direct bonding to reduce the number of reflecting surfaces, a reasonably strong backlight, and a touch technology with as little impact on display viewability as possible. The following all contribute to the best currently possible outdoor-viewable display:

    • IPS or AFFS LCD (for perfect viewing angle)
    • Anti-reflective and anti-glare treatments
    • Circular polarizers (combination of a linear polarizer and a retardation film to block reflected light)
    • Minimal number of reflective surfaces via direct bonding
    • Sufficiently strong backlight
    • Hard, flat surface to eliminate distortions
    • Suitably high resolution
    • Touch technology that does not show, distort or create optical aberrations
    • Surface that's not prone to fingerprints

    As is, most major vendors of rugged mobile computing technology offer outdoor/sunlight-viewable displays standard or as an option. Most perform quite well, though there are significant differences that can really only be evaluated in side-by-side comparisons, as the industry does not generally disclose exactly how displays are treated. Such disclosure, and also the inclusion of the effective contrast ratio in product specs, would be tremendously helpful. That, of course, would require generally agreed-on standard definitions and testing procedures.

    The last word in outdoor-viewable display technology has not yet been spoken, and it's more than likely that disruptive new technologies will replace what's available now. However, today's technology has reached a point where it can be good enough to allow work in bright daylight and even direct sunlight, though it definitely pays to see for yourself which particular implementation works best for any given project. -- Conrad H. Blickenstorfer, November 2013

    (For the definitive article on the modified transmissive approach, see "GD-Itronix DynaVue Display Technology" by Geoff Walker. It goes back to 2007, but all the principles remain valid today. Thanks also to Mr. Walker for feedback and suggestions for this article.)

    Posted by conradb212 at 10:51 PM

    October 23, 2013

    Two annoying trends

    Today I am going to rant a bit about two trends that simply make no sense to me.

    The first is "skeuomorphism." It's the new fashion word du jour, what with Apple and Microsoft demonizing it as if it were some sort of evil plague. As is, Wiki defines a skeuomorph as "a derivative object that retains ornamental design cues from structures that were necessary in the original." That includes, of course, many elements of graphical user interfaces. The desktop metaphor, after all, has been at the very core of every graphical user interface for the past 30 years.

    But now that very quality of making objects on a computer screen look just like the real thing has come under heavy attack. And that even includes the three-dimensional nature of the real world. Apple especially, but also Microsoft, now wants everything to be flat. As flat as possible. Anti-skeuomorphism forces argue that the public has now been exposed to computers long enough to no longer need the analogy to real, physical things. And so, in the eyes of many, the latest versions of many operating environments, and Apple's iOS, look dreadfully flat and barren indeed.

    In Apple's case, one could well argue that a bit of a backlash against skeuomorphic excess was probably a good thing. Apple, long the champion of good design, had begun slipping with some truly kitschy stuff, like the "leather-bound" address book, the Ikea-style wooden bookshelf and other affronts that likely would have had a healthy Steve Jobs frothing at the mouth. So a bit of streamlining was in order, but when I look at the sad, sparse, flat expanse and eensy-tiny lettering that now mars iOS and many of its apps, the sad icons that look like they just want to vanish from view, and the rest of the bleakness that now adorns iPhones and iPads, I wish Jonathan Ive and colleagues had stuck with hardware.

    You could argue, of course, that after decades of visual improvements and fine-tuning, the anti-skeuomorphism crusade simply rings in the advent of fashion in electronic interfaces. Just as fashion swings to extremes only to denounce the trend and swing to the opposite extreme (neatly obsoleting billions of dollars worth of product in the process), perhaps we'll now have to put up with anemic, flat computer and tablet screens until the trend reverses and everything becomes three-dimensional and lively again.

    The second trend is the push toward thin and slender hardware design at all costs. The just-announced new Apple iPad Air is hailed as a wondrous achievement because it's thinner yet and weighs even less. It's a veritable Barbie of a tablet. And since this is Apple, and Apple decreed some years ago that computing devices need to be rectangular and very flat, we now have hyper-slim smartphones, hyper-thin tablets, and even hyper-thin iMacs, which in the latter's case makes absolutely no sense since they sit on a desk in front of you. And we also have hyper-thin HDTVs. Size is okay, as we now have smartphones with screen sizes approaching 6 inches and flat-screen TVs approaching 90 inches. But it all must be very flat and thin.

    Why?

    I mean, making that technology so very flat simply makes it more difficult to design and manufacture, and since hardware happens to be a physical thing it often loses utility if it's pressed into too flat of a design (the new iPad Air's battery is down to 32.9 WHr, vs. the iPad 4's 43 WHr). The dreadful sound of flat-screen TVs is a prime example, and the so-so battery life of many super-slim smartphones another. More and more the trend to supreme thinness seems more a narcissistic quest to prove that one's technology is so advanced that mere physicality no longer matters. Sort of like a supermodel starving herself into a skeletal, gaunt appearance just to be lauded for her discipline and elegance.

    It makes no sense. I mean, the latest Crossover probably weighs almost 5,000 pounds, a house weighs an awful lot, and American people weigh more all the time, too. So why is ultimate thinness in an electronic device such a virtue?

    And it especially makes no sense for rugged devices, where the very physicality of the design provides the structure and toughness to make them last on the job, and where a degree of volume means they'll run cooler and provide space for expansion and versatility. Yet even rugged devices are getting thinner all the time. They have to, or the public, even customers in enterprise and industrial markets, will stay away.

    So there, two silly trends. And trends they are, as you can't keep making physical stuff thinner beyond a certain point. Once that point is reached, the quest is over, and the pendulum will reverse or go elsewhere. It's quite possible that the next Steve Jobs will some day present the latest super-gadget, and it will be egg-shaped. It's possible.

    Be that as it may, I just hope that technology will stay as free from fashion dictates as possible. Because once it takes fashion to sell gear, that means innovation is over.

    Posted by conradb212 at 8:21 PM

    October 18, 2013

    Rugged Android device comparison table, and contemplations over Android in the rugged market

    On October 18, 2013, RuggedPCReview.com launched a rugged Android device comparison table. The table allows interested parties to view full specifications of all rugged handhelds and rugged tablets we're aware of.

    Given the absolutely massive number of Android devices activated worldwide -- about a billion -- it's amazing how few rugged Android devices are available. As we recently reported, both Honeywell/Intermec and Motorola Solutions have launched initiatives to make rugged Android devices available to industrial and enterprise markets, and other manufacturers are offering ruggedized Android-based handhelds and tablets as well. But there aren't many actual devices, probably fewer than a couple of dozen in both categories combined. And even that small number includes products that are available with either Android or one version of Windows Mobile or another, which means they aren't really optimized for either.

    Add to that the fact that few of the available Android-based rugged devices are on the latest, or even a recent, version of Android, and that much of the hardware isn't anywhere near the technological level of consumer smartphones and tablets, and one has to wonder all over again why Android has such a terribly hard time getting going in rugged/industrial devices.

    On Microsoft's website you'll find a white paper entitled "Choose Windows Mobile Over Android for Ruggedized Handhelds" written by Gartner in February 2011 (see here). Among the key recommendations there were to "remain with Windows Mobile for ruggedized handheld-computer solutions, and to prepare for a transition to full Windows in subsequent implementations" and to "limit the scope of Android-based ruggedized application development through 2013." Of course, the two and a half years since Gartner issued the paper is an eternity in mobile electronics. At the time they still mentioned Android as the #2 smartphone OS behind Symbian!

    Gartner also cautioned that the lack of FIPS 140 compliance (FIPS 140 is a U.S. government computer security standard that specifies cryptographic requirements) was an issue for Android, and predicted that enterprise software vendors would probably create HTML5-based client applications with cross-platform abstraction layers to provide some support for Android devices. FIPS 140 compliance was indeed an issue with Android, and one that's even now still only addressed on a device-by-device level. Cross-platform application development is now available via platforms such as RhoMobile and iFactr.

    I don't know how widely read Gartner's 2011 report was, but over the past three years the rugged computing industry certainly heeded Gartner's advice of choosing Windows Mobile over Android for ruggedized handhelds. Gartner's 2011 arguments made sense, but probably even Gartner didn't foresee that the installed base of Android devices would grow from under 200 million back in 2011 to a cool billion today. One could easily argue that playing it safe with Windows Mobile precluded participating in the rapid, massive learning curve with Android over the past two or three years.

    There are no good answers, and hindsight is always 20/20. Except that even two or three years ago it was quite obvious that Windows Mobile was doomed, and Microsoft did not seem to have a compelling roadmap in the mobile space. In historic terms, the predicament the rugged handheld and tablet manufacturers have been facing over the Android issue is no less harrowing than the predicament they faced a decade and a half ago when there was increasing pressure to abandon their various proprietary operating platforms in favor of Windows CE.

    What's the answer? Hard to say. It is impossible to ignore a user base of a billion and counting, because that billion already knows Android and how it works. On the other hand, Android's fragmentation is vexing, there remain questions about platform security (see overview of Android security), and the fact that Android was as clearly designed for capacitive multi-touch as Windows was for a mouse makes it less than perfect for gloves and wet places. At this point it is also still possible that Microsoft might somehow pull a rabbit out of its hat with Windows Embedded 8 Handheld, causing a percentage of the rugged mobile market to go one way and the consumer market another. Remember that the Palm OS once dominated the mobile OS market to such an extent that Symbol (now part of Motorola Solutions) had a Palm-based industrial handheld (see here) before the tide went the other way.

    Posted by conradb212 at 6:28 PM

    October 2, 2013

    October 1, 2013 -- the day Moto Solutions and Honeywell/Intermec became serious about Android

    This has been quite the day for Android in the rugged handheld space.

    Intermec, now part of Honeywell Scanning & Mobility, announced the CN51 rugged mobile computer. It is an updated version of Intermec's successful CN50, but has a larger, higher resolution screen (4.0-inch, 800 x 480) that now uses resistive multi-touch, updated WiFi, WAN, Bluetooth, camera, and scanners, and it's now based on the dual-core 1.5GHz TI OMAP 4 processor, which means Intermec can offer the CN51 either with Microsoft Windows Embedded Handheld 6.5 OR with Android 4.1.

    And Motorola Solutions, the same day, announced that three of its popular enterprise mobile computers would now be available with Android Jelly Bean, fortified with Moto's own Mx security, device management and performance features that they added to the standard Android OS. So as a result, the following three Motorola Solutions devices will now be available with Android Jelly Bean:

    Motorola Solutions Android mobile computers

    Product       ET1                     MC40                   MC67
    Type          Enterprise tablet       Rugged handheld        Enterprise PDA
    Display       7-inch (1024 x 600)     4.3-inch (480 x 800)   3.5-inch (480 x 640)
    Digitizer     Capacitive multi-touch  Capacitive             Resistive
    Original OS   Android                 Android                Embedded Handheld 6.5
    Available OS  Android Jelly Bean      Android Jelly Bean     Android Jelly Bean
    RAM           1GB                     1GB                    1GB
    Storage       4GB Flash + microSD     8GB Flash              8GB Flash
    Size (in.)    8.8 x 5.1 x 1.0         5.7 x 2.9 x 0.8        6.4 x 3.0 x 1.3
    Weight        1.4 lbs.                9.4 oz.                13.5 oz.
    CPU           1GHz OMAP 4             800MHz OMAP 4          1GHz OMAP 4
    Scanning      via 8mp camera          SA4500-DL              SE4500-SR/DL/DPM
    WWAN          Data-only               NA                     Voice and data
    Op. temp      32 to 122F              32 to 122F             -4 to 122F
    Sealing       IP54                    IP54                   IP67

    Motorola Solutions points out that their three Android offerings are not only benefitting from the Mx extensions (see Mx website), but also from the company's RhoMobile Suite (see RhoMobile page), a cross-platform development tool used to create apps that are OS-independent. Which means the Moto Android devices can run standard Android apps, or HTML 5 cross-platform apps created with RhoMobile.

    What does it all mean, this pretty high-visibility push of Android in business and enterprise class devices? Well, there's the odd situation that while almost all smartphones are either iPhones or Android devices, virtually all industrial handhelds still run one version or another of Microsoft's old Windows CE or Windows Mobile. That means most industrial handhelds are by no means ruggedized equivalents of the smartphones a good part of humanity already knows inside out, which results in extra training, an extra learning curve, and the near certainty that change will come anyway. Up to now, rugged mobile computing manufacturers have been remarkably conservative in acknowledging that trend, generally limiting themselves to offering an exploratory Android version or two of a Windows device running on similar hardware. What we're now beginning to see is the next step: making the hardware so that it can run both old and new.

    Now, no one wants to alienate Microsoft, of course, and so things are worded carefully. Intermec's press release includes a quote from industry analyst David Krebs, VP of mobile & wireless at VDC: "While rugged devices are designed to be more function or application-specific than smartphones, there is growing consensus that these devices deliver a similar immersive experience and have similar capabilities as function-rich smartphones. As Android matures in the enterprise it represents an increasingly viable option for rugged vendors such as Intermec to bridge this functionality gap and deliver the capabilities their partners and customers are looking for."

    On the Motorola Solutions side, Girish Rishi, Senior VP of enterprise solutions, said, "Now, businesses have more choices and can better manage rapid changes in the market by using Motorola’s tools that deliver complete solutions in less time while protecting their mobile investments.”

    It's fairly safe to assume that these are just first steps. The proposed hardware still represents compromises and is not (yet) truly Android optimized. But the message communicated by both Intermec/Honeywell and Motorola Solutions is quite clear: We can't wait any longer, Microsoft. We need to get with the program before we lose our markets to consumer smartphones in a case.

    Posted by conradb212 at 12:09 AM

    August 11, 2013

    Optimizing the legacy Windows interface for touch and tablets

    Tablet computers have been around for a quarter of a century, but it wasn't until the iPad's introduction that the tablet form factor took off. That's in part because the technology wasn't quite ready for tablets, and, in a much larger part, because Windows just didn't work well with tablets. Tablet historians will remember that both the first generation of tablets (circa 1992) and the second one (circa 2001) primarily used Wacom active digitizer pens. Those pens were (and are) precise enough to operate the Windows user interface, especially when aided by special Windows utilities and accommodations (Windows for Pen Computing back in 1992, and the Windows XP Tablet PC Edition in 2002).

    Ever since the iPad came onto the scene, the market has been demanding touch instead of pens. Touch works great with operating environments designed for touch, such as iOS and Android, but not nearly as well with Windows. As of late, we've seen a number of new Windows tablets that use either resistive touch or even projected capacitive touch. Resistive works best with a stylus, though a firm finger touch usually also works, albeit not very precisely. Capacitive touch and legacy out-of-the-box Windows, however, are not a great match (except for the new Metro environment in Windows 8).

    There's not much that can be done to remedy that situation. Many people want and need Windows on a tablet, but in order for it to work like Windows, it needs the full Windows user interface, and that is simply not designed for pen and touch. As a result, if you work on a tablet with a small screen and limited resolution, your screen may look like it does in the picture below:

    The small scrollers, check boxes, menus and text work perfectly well if the tablet is used with a mouse, but they are almost impossible to use with touch.

    Fortunately, there are ways to improve the situation, if only to some extent. And that is done through customization of the Windows interface. Here's how it's done.

    In Windows 7, go to the Control Panel, then select Personalization. Go to the bottom of the screen and click on Window Color. At the bottom of that window, select Advanced Appearance Settings... What then shows up is the Window Color and Appearance dialog that has been around for many years. It lets users adjust the size of numerous Windows interface items.

    The "Item" pulldown provides access to:

    3D Objects (lets you select the color)
    Active Title Bar (select size and font)
    Active Window Border (lets you select the color)
    Application Background (lets you select the color)
    Border Padding (select size)
    Caption Buttons (lets you select the color)
    Desktop (lets you select the color)
    Disabled Item (lets you select the font color)
    Hyperlink (lets you select the color)
    Icon (select size and font)
    Icon Spacing (Horizontal) (select spacing)
    Item Spacing (Vertical) (select spacing)
    Inactive Title Bar (select size, color and font)
    Inactive Window Border (select size and color)
    Menu (select size, color and font)
    Message Box (select font, font size and color)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)
    ToolTip (select color, font, font size and font color)
    Window (select color)

    Here's what it looks like:

    The default values for all of those items are, surprise, what works best for a desktop computer with a mouse and a large screen. Problem is, those default desktop/mouse settings are also what virtually all Windows tablets come with. It's as if vendors thought Windows worked equally well on any size and type of platform, which it definitely does not.

    As a result, Windows, which isn't very suitable for tablets in the first place, is even worse with the desktop/mouse default settings. So how are we going to remedy that situation? Not easily, but it can be optimized to work better on touch tablets.

    Colors, obviously, don't make a difference. So let's forget about all the interface items where you can only change color. Icon size and spacing are also pretty much a matter of choice, as is font color, so let's drop those as well. That leaves:

    Active Title Bar (select size and font)
    Border Padding (select size)
    Inactive Title Bar (select size, color and font)
    Inactive Window Border (select size and color)
    Menu (select size, color and font)
    Message Box (select font, font size and color)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)
    ToolTip (select color, font, font size and font color)

    Inactive items also don't matter, and neither does stuff you don't actually interact with, so let's drop those. That leaves:

    Active Title Bar (select size and font)
    Border Padding (select size)
    Menu (select size, color and font)
    Palette Title (select size, font and font size)
    Scrollbar (select size)
    Selected Items (select size, color, font, font size and font color)

    Now we get to what matters. Here's why and how:

    Active Title Bar -- contains the three all-important boxes that minimize a window, maximize it, or close it. Those are used all the time. They must be large enough to easily operate with touch or a stylus. (On a 7-inch 1024 x 600 pixel display, I set title bar size to 32 and font size to 11).

    Border padding -- this one is annoyingly important. It's important because your finger or stylus must connect with the border to resize the window. It's annoying because a border large enough to easily manipulate is an eyesore and takes up too much space, making Windows look clumsy.

    Menu -- not really, really important, but you'd like to be able to see the menu choices, so make them large enough. I used 32 for size, and 11 for font size.

    Palette Title -- honestly, I don't know why I left this one in there. I set it to 32 size and 11 font size.

    Scrollbar -- perhaps the most important one of all. If you need to scroll up and down and left and right with your finger or a stylus, it MUST be large enough to easily touch. I made it 32, and the font 11.

    Selected items -- that's the choices row when you select something from a menu. It needs to be large enough to read and select from via touch. Made it 32 and the text 11.

    So there. Once you've done this, Windows won't be that terribly difficult to operate with your fingers or a stylus. It's not going to look very pretty, and you'll use up far too much valuable screen real estate with interface items that were designed for a mouse and are now resized to work as well as they possibly can with touch.

    And here's what it looks like. Note the much larger scrollers, menus, window borders and text:

    Another good thing to know is that you can save those settings as themes (see image below). Which means you can quickly toggle between a theme that's optimized for use with a mouse (or when the tablet is hooked up to a larger external screen), and when it is used as a tablet with finger touch.

    Can that be done in Windows 8 as well? Not as easily. The "Advanced Appearance Settings" option is gone from the Windows 8 (legacy) Control Panel. To change the interface, users can change text size in the Display control panel. But everything else can only be adjusted with the Registry Editor, which is not for the faint of heart (see how it's done).
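    For those attempting the registry route, it helps to know how Windows stores these metrics. The sizes live under HKEY_CURRENT_USER\Control Panel\Desktop\WindowMetrics (value names such as ScrollWidth, CaptionHeight, MenuHeight and PaddedBorderWidth), stored as negative twips: at the standard 96 dpi, one pixel equals 15 twips, and the stored value is negated. Here's a minimal sketch of that arithmetic in Python -- the value names above come from the documented Windows 7 WindowMetrics key and should be verified in your own Registry Editor before editing anything:

    ```python
    # WindowMetrics registry values (HKEY_CURRENT_USER\Control Panel\Desktop\
    # WindowMetrics) store interface-item sizes as negative twips.
    # At the standard 96 dpi, 1 pixel = 15 twips, and the value is negated.

    def pixels_to_metric(pixels):
        """Convert a desired size in pixels to a WindowMetrics registry value."""
        return -15 * pixels

    def metric_to_pixels(metric):
        """Convert a WindowMetrics registry value back to a size in pixels."""
        return -metric // 15

    # A 32-pixel scrollbar (the size used on the 7-inch tablet above) would be
    # stored as -480; the Windows default of 17 pixels is stored as -255.
    print(pixels_to_metric(32))    # -480
    print(pixels_to_metric(17))    # -255
    print(metric_to_pixels(-255))  # 17
    ```

    So to get the same touch-friendly 32-pixel scrollbars described above, you would set ScrollWidth and ScrollHeight to -480. As always with the registry, export the key first so you can restore the defaults.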

    Posted by conradb212 at 3:48 PM

    June 25, 2013

    Logic Supply's logical approach to engineering their own systems

    When it comes to rugged computing gear, most people interested in this industry know the big players that dominate the market and get all the media coverage. But that's not everything there is. Unbeknownst to many outside of the circle of customers and prospects, a surprising number of smaller companies are designing and manufacturing rugged computing systems of one type or another. At times we come across them by chance. Other times they find us.

    And so it was with Logic Supply, located in South Burlington, a small town in the northwestern part of Vermont. They call themselves "a leading provider of specialized rugged systems for industrial applications," and asked if we could include them in our resource page for rugged system vendors. We could. And that led to some back and forth where I learned that while Logic Supply distributes a variety of rugged/embedded systems and components, they have also begun developing their own high-end chassis under their own LGX brand. That was interesting, and so we arranged an interview with Rodney Hill, Logic Supply's lead engineer, about the company and how they go about creating their own, home-developed solutions in addition to being a distributor of products engineered elsewhere.

    RuggedPCReview: If you had to describe Logic Supply’s approach to case and system engineering in a minute or less, what would you say?

    Rodney Hill (Logic Supply): So, LGX is Logic Supply’s engineering arm. The design approach for LGX systems and cases can be boiled down to three ideas. First is our “designed to be redesigned” philosophy. Seed designs that are scalable and modular. From a seed idea we can create a product line through swappable front- and back-plates or resized geometry. Second is mass customization — by using standardized screws, paints, sheet metal folds, and design concepts, we leverage mass produced hardware whenever possible to keep the cost low. And through our modular designs we customize the off-the-shelf by using “upgrade kits,” which are quick to source and are cost-effective. Finally, innovation, not invention! There is a difference. Add value to things that work well, but do not re-invent the wheel.

    Was that under a minute?

    RuggedPCReview: Almost. But now let’s expand. You said scalable, modular, and “designed to be redesigned.” What do you mean by this?

    Rodney Hill (Logic Supply): Designing a new chassis is four to five times the price of redesigning a seed design. Much of the time wasted in projects is spent selecting paints, screws, boxing, foam, metal fold designs, etc. By using standardized design methods and seed concepts, our team can immediately start adding value. Ultimately the customer is only paying for the design and not the time the engineers spent trying to get their act together. We will be faster and more focused on quality and containing costs and risk.

    So, to your question, designed to be redesigned systems from LGX have already incorporated flexible features to accommodate 80% of the customizations that customers request with off-the-shelf hardware. The last 20% are resolved with ‘upgrade’ kits that will be included with the off-the-shelf chassis kit. But you’re also using the proven benefits of the rest of the chassis (EMI and RFI shielding, for instance), and only adding risk in small portions. Meaning the rest of the chassis is still meeting all the same design criteria it was originally intended to support. So you can easily customize it without the risk of any negative effects on any of those features.

    In terms of scalability versus modularity — there are design themes seen in our cases. If you look carefully enough, you can begin to see connections between designs. The NC200 and the MK150 are two totally different designs – however they share about 80% of the same DNA, from vent holes to metal folds, etc.

    RuggedPCReview: How does cooling play into ruggedization?

    Rodney Hill (Logic Supply): Nature always wins. Meaning dust and water will destroy everything if given the chance. You need to decide how long the computer needs to live, and how much you’re willing to pay for it. Heat will shorten the life of components.

    So in terms of chassis design concepts: Keep the chassis as cool as possible and as quiet as possible. Intelligent design is required to incorporate standardized cooling methods and proven airflow paths to cool many types of devices. Fan diameter, placing, vent design all will have effects on the acoustic design as well. Logic Supply will engineer noise out of systems with fan-muffling technologies, maximizing air throughput with smaller, simpler fans by identifying inefficiencies in orifice design. In short, having a fan against a grille will kill 50-60% of the airflow and multiply the noise by two or three times.

    Vent holes equal dust. Dust causes fans to break, which in turn results in hot computers. Eliminate vents and go fanless. The operational temperatures and ruggedness greatly increase. Logic Supply defines "fanless" differently than the IMP mass market. Our definition is not simply "no fans." It is more than that: no fans, no vents for dust and dirt, and maintain the ability to cool the computer system at 100% duty load for hours and days at a time. We want these systems to be heavy duty, and also to be able to last a long time. It is rated for high performance!

    RuggedPCReview: Can you talk about the design process? How long does it take from start to finish?

    Rodney Hill (Logic Supply): It happens pretty fast. This year we’ve done a fanless Mini-ITX case, a 1U rackmount case, a NUC [Next Unit of Computing] case, and we’re finalizing a fanless NUC case right now. We’ve also finished a number of customer-specific designs. These design concepts typically originate in sales — you know, this customer wants to do X and none of our existing solutions do it. But because we use seed designs, we don’t start from scratch. It really all depends, but usually designs take under three weeks, and prototypes are ready a few weeks after that. We review, test, and modify, then we’re typically getting production units in-house around five or six weeks after that.

    These core platforms can then be sold off-the-shelf, or customers can either go the semi-custom route or more radically modify the design. For simple modifications (like back-plates, front-plates, and simple changes) maybe one to five days in design and a three to four day lead on parts. For customized chassis design with samples, five to six weeks, and four to six after that to mass production.

    RuggedPCReview: Alright, finally, can you give us an example of a successful customer product development?

    Rodney Hill (Logic Supply): Sure. Last year we worked with a company called StreamOn to make a custom appliance with off-the-shelf components. StreamOn offers streaming audio solutions for the radio broadcast industry. The hardware they were using at the time was going End-of-Life, and they also needed a more specialized embedded system because their business was growing and they wanted to offer more features to their customers. They needed a variety of other things — outsourced fulfillment and things like that — but from an engineering perspective it was mostly that — the EOL and specialization. And all while remaining affordable for their customers. We worked from an existing system design — the ML250 — and customized it toward what they needed. We added an SSD, LCD screen and multifunction buttons, and on-case branding.

    Ultimately, the system we created was something like 30% smaller, and it was fanless, so it was more efficient, and had a longer life expectancy. It also had a built-in watchdog timer and auto restart bios so it could avoid any complications related to sudden power outages, etc. And it actually ended up being even less expensive for their customers than what they were previously offering. So that all worked out quite well. In fact, they recently won the [Radio World] “Cool Stuff Award,” which was pretty, well, cool!

    This whole process was consistent with our typical design timeline, by the way. From the initial conversations to mass production — with samples and prototyping — we took about three months.

    Posted by conradb212 at 3:16 PM

    June 24, 2013

    Why the JTG Daugherty NASCAR racing team chose rugged Dells

    The Christmas tree began its count-down. Yellow, yellow, yellow, GREEN! For an anxious moment, the racing slicks of my supercharged Acura fought for traction, then bit. 8,000, 8,500, 8,800 rpm, shift. Shift. Shift. Shift, and the 1/4-mile at Sacramento Raceway was over. The car slowed and I reached over to stop data logging on the laptop securely sitting on its mount, just having recorded tens of thousands of data points as the car shot down the track. The laptop was a Dell Latitude ATG 630D, connected via USB to the Hondata ECU under the dash of the car. Minutes later I would analyze the run on the Dell, temperatures, shift points, slippage, air/fuel ratio, knocks, timing, etc., and then make changes on the fly. The next heat was in less than 15 minutes.

    At the time I didn't know that a few years later I'd be talking with the JTG Daugherty NASCAR racing team about how they used rugged Dell laptops on their #47 Sprint Cup car, driven by NASCAR legend Bobby Labonte. Labonte won the Cup in 2000 during an era where he was a perennial contender. And also won IROC in 2001, following in the footsteps of his brother Terry Labonte, also an IROC and Cup champion. Now a senior amongst NASCAR drivers at age 49, Labonte's piloting car #47 for the team owned by Jodi and Tad Geschickter, and NBA Hall of Famer Brad Daugherty. Lady Luck hasn't been too kind to them this season, but that's certainly not due to this talented group and also not due to the technology they're using. Most recently, while Martin Truex Jr. won at Sonoma Raceway in his Toyota Camry, a blown oil cooler ended Labonte's race in essentially the same Camry before it even began. Those are the breaks.

    So I felt almost guilty when I got on the phone Monday morning after that race with Matt Corey, who is the IT administrator at JTG Daugherty Racing, and Dell's Umang Patel and Alan Auyeung to discuss JTG Daugherty's use of Dell technology. Corey, in particular, probably didn't feel too good after the frustrating weekend and had plenty of other things to do at their shop, but he took the call. Much appreciated.

    So how is JTG Daugherty Racing using Dell computers? And what made them decide to use Dell from the driver to the garage and the pit crew to the data center with a complete suite of Dell technology and solutions that also includes rugged Dell ATG and XFR laptops? The choice of Dell for data center and office isn't much of a surprise, given that Dell has consistently been in the top three PC vendors worldwide and in the US. What's more interesting is that JTG Daugherty also chose Dell for their rugged laptops, a field dominated by Panasonic, Getac and a number of other vendors specializing in rugged equipment.

    Corey began by explaining the inherent need of a NASCAR racing team for rugged technology. No surprises here. There's rain, dust, vibration, extreme temperatures, the whole gamut of hazards rugged mobile computing gear is designed and built to survive. Add to that the extreme time crunch as a race car is tested and prepared, the extreme need for absolute reliability in a sport where fractions of a second matter, and a race car is checked, refueled and has all of its tires changed in something like 13 seconds. Things simply must not go wrong, ever, in such an environment, and racing teams certainly cannot put up with finicky computing technology that may or may not be up to the job. As an example, Corey tells of an incident where a consumer laptop simply wasn't able to handle vibration, causing them a lot of grief.

    So as a result, JTG Daugherty now uses rugged gear. Their race engineering team has Dell Latitude E6430 ATG laptops. The ultra-rugged Dell Latitude E6420 XFR is used on the truck and trailer. They also use Windows-based Dell Latitude 10 tablets in Griffin Survivor cases supplied by Dell. All of this means that the team can collect performance stats, analyze them, and make changes quickly and reliably. "We have connectivity everywhere," said Corey. "As the car chief makes a decision about a change to the car, for example, he now notes this on his Latitude 10 and the information is instantly communicated to everyone across the organization. All decisions from the car chief trickle down to updates to the racecar and with everyone synced together with tablets and other Dell technology, that information flow is now much faster, more reliable and more efficient."

    But still, why Dell for the rugged gear? Here I expected Corey to point to the advantage of dealing with a one-stop vendor. Instead he says that they had used Toughbooks in the past and liked them, but that "they really didn't change much over the years, same always," and that Dell updates more frequently. "We don't want 'plain vanilla,'" he said, "we want to be on the cutting edge of technology," and he lists the memory and processing speed required to power through race simulations, high resolution imaging, and massive data sets.

    Staying at, or near, the leading edge in technology while still adhering to the longer purchase cycles and life spans of rugged equipment, and guarding against obsolescence of docks, peripherals, accessories and software has always been a challenge for the rugged computing industry. While Intel tends to unveil a new processor generation and ancillary technology every 12 to 18 months, the rugged industry cannot possibly update at the same pace. Even Dell is not immune in that regard; as of now, the rugged XFR laptop is still based on the E6420 platform.

    Yet, Dell does have the advantage of very high production volume and with that comes access to the very latest technology. Combine that with the convenience and peace of mind of dealing with a large one-stop shop, and it's no surprise that even a NASCAR racing team chose Dell.

    See NASCAR Team Selects Dell to Speed Past the Competition, Dell's Latitude for Rugged Mobility page, and RuggedPCReview.com's most recent reviews of the Dell Latitude ATG and Dell Latitude XFR.

    Posted by conradb212 at 7:47 PM

    May 28, 2013

    Rugged notebooks: challenges and opportunities

    I've been working on setting up our new rugged notebook comparison tool over the past few days. So far, the tool, where users can compare the full specs of up to three rugged notebooks side-by-side and also quickly link to our analysis of the machines, has far fewer entries than our comparison tools for rugged handhelds and rugged tablets. As I asked myself why there were only relatively few products out there, I thought about the overall rugged notebook situation.

    A little while ago I came across a news brief by DigiTimes, the Taipei-based tech news service that's always interesting to read (albeit not always totally accurate). The news item was about Getac gunning for an increased market share in rugged notebooks. DigiTimes said the current worldwide rugged notebook market share situation was something like Panasonic having 60%, and Getac and General Dynamics Itronix each about 12.5%. They didn't specify the remaining 15%, but it's obviously a number of smaller players.

    That news came just a short while after General Dynamics officially pulled the plug on Itronix, so those 12.5% that used to be GD-Itronix rugged notebooks such as the GD6000, GD8000 and GD8200, are now gone and up for grabs. Who will step up to bat? Will Getac take over what GD-Itronix used to have? Or will Panasonic's Toughbooks get even more dominant? Or will perhaps someone else emerge?

    There's no easy answer. And the market is a rather fragmented one. First, it's not totally clear what makes a notebook "rugged." In the business we generally differentiate between "rugged" and "semi-rugged," where the more expensive fully rugged devices carry better sealing and are built to handle more abuse than semi-rugged models that offer somewhat less protection, but usually cost and weigh less in return. But rugged and semi-rugged are not the only designations you see in the market. Some manufacturers also use terms such as "business-rugged," "vehicle-rugged," "durable," or even "enterprise-rugged." There's also "fully-rugged" and "ultra-rugged."

    Of machines on the market, we'd consider products such as the Panasonic Toughbook CF31, Getac B300 or GD-Itronix GD8200 as rugged, and the Panasonic Toughbook 53, the Getac S400 and the GD-Itronix GD6000 as semi-rugged. But then there are also notebooks specifically made for enterprise and business that are better made than run-of-the-mill consumer notebooks, but somehow defy definition. Examples are the very light magnesium notebooks by Panasonic that cost a lot more than any regular laptop and can take much more abuse, but do not look tough and rugged.

    Then there's yet another category of laptops that are almost exclusively used for business and vertical market applications, and that's the convertible notebooks. These had their origin when the industry was intrigued by tablets in the early 1990s and then again in the early 2000s, but wasn't quite sure if customers would accept them, so they made something that could be used both as a tablet and as a laptop. These usually cost more than notebooks, and were heavier than tablets, but somehow the concept is still around, and there are many models to choose from. Some are fully rugged, such as the Getac V100/V200 and the Panasonic Toughbook 19, others are semi-rugged like the Panasonic Toughbook C2, or business-rugged, such as the Lenovo ThinkPad X230t or the HP EliteBook 2760p.

    Yet another category is rugged notebooks that are based on large volume consumer notebooks. Examples are the semi-rugged Dell Latitude ATG and the fully rugged Dell Latitude XFR. With Dell having quick and easy access to all the latest technology, the ATG at least is almost always at, or close to, the state-of-the-art in processors and other technology.

    And there are further twists. While the likes of Panasonic and Getac make their own notebooks, a good number of others are made by a small handful of OEMs under exclusive or (more often) non-exclusive agreements with resellers that put their own brand names and model numbers on the devices. Taiwanese Twinhead, for example, had a longstanding relationship with the now defunct General Dynamics Itronix, with some models exclusive to Itronix and others marketed by various vendors through different channels. That can make for interesting situations. While Twinhead was and is an important OEM, they also sold their mostly semi-rugged lineup under their own name and the Durabook brand, and also through their US subsidiary GammaTech.

    But there's more. A number of smaller players, or small parts of larger industries, provide highly specialized rugged notebooks that are often so unique as to only target very narrow markets. Some machines are built specifically to the requirements of military and other government contracts. Their names and brands are usually unknown to anyone outside of the small circle of targeted customers.

    Why are there so few rugged and semi-rugged notebooks? One reason is that the market for them isn't all that large. They are popular in police cars and similar applications, and wherever notebooks simply must be much better built than fragile consumer models. Another reason is price. Even relatively high-volume semi-rugged laptops cost two to three times as much as a similarly configured consumer model. Rugged notebooks run three to five times as much, and specialized models may be ten times as much.

    By and large, the rugged computing industry has been doing a good job educating their customers to consider total cost of ownership as opposed to looking only at the initial purchase price, but it's not always an easy sell. And with handy, inexpensive tablets flooding the market, it isn't getting any easier. Add to that the fact that makers of rugged notebooks always had a special millstone hanging around their necks, that of having to make sure that products stay compatible with existing docks, peripherals and software. That often prevents them from adapting to new trends and switching to newer technologies and form factors (like, for example, wider screens) as quickly as some customers demand. While it's certainly nice to see Intel coming out with a new generation of ever-better processors every year or two, it's not making it easier for rugged manufacturers to stay current in technology and features either.

    As is, if Itronix really had a roughly 12.5% market share, that slice of the pie is now up for grabs and it should be interesting to see who ends up with it.

    Posted by conradb212 at 3:24 AM

    May 17, 2013

    How Motorola Solutions made two mobile computers condensation- and freezer-proof

    Good phone conversation today with the PR folks from Motorola Solutions. The occasion was the introduction of two interesting new products, the Omnii XT15f industrial handheld, and the Psion VH10f vehicle-mount computer. The key here is the "f" in both of the names. It stands for "freezer" and that's what the two new devices are all about. Big deal?

    Actually, yes. At least for workers who use their computers in and around freezers. That includes storage of perishable foods, the strictly temperature-controlled environments where medications are stored, and numerous other places for goods that need to be or stay frozen. So what's the issue? You just get devices that can handle the cold and that's it, right?

    Yes, and no. While the environmental specs of most rugged computing devices include their operating temperature range, the range only tells the temperatures within which the device can be used. In the real world, and particularly when working around freezers, temperature alone isn't the whole issue. What matters is how a device can handle frequent, rapid changes in temperature. The real enemy then becomes condensation, and not so much temperature. Extreme temperatures remain an issue, of course, but one that must be addressed as part of the larger issue of rapidly changing temperatures.

    So what exactly happens? Well, if you go from a hot and humid loading dock into a freezer, the rapidly cooling air in and around a device loses its ability to carry moisture, which then becomes condensation. That condensation then freezes, which can cause frost on displays, rendering them illegible, frozen keys on the keypad, and possibly internal shorts. When the worker leaves the freezer environment, the frost quickly melts, again affecting legibility of the display and possibly causing electrical shorts. It's quite obvious that extended cycling between those two environments not only makes the device difficult to use, but it's almost certainly going to cause damage over time.

    Now add to that the slowing down of displays in extreme cold and the general loss of battery capacity, and it becomes obvious why this is an issue for anyone using mobile computers in those environments. And hence the new "freezer" versions of those two Motorola Solutions products (Omnii XT15f on the left, Psion VH10f on the right).

    So what did Motorola do? Weber Shandwick's ever helpful Anne Norburg suggested I talk to the source and arranged the call, and so I had a chance to ask Amanda Honig, who is the Product and Solutions Marketing Manager for Enterprise Mobile Computing, and Bill Abelson of Motorola's media team. The overall challenge, they said, was to provide reliable "frost- and condensation-free" scanning. In order to do that, they had to address a number of issues:

    Since the scan window can fog up, they used internal heaters to automatically defog the window, thus facilitating accurate scans under any condition.

    Since external condensation can quickly freeze around keys and make the keypad difficult or impossible to operate, they designed special freeze-resistant keyboard layouts with larger and more widely spaced keys.

    Since the airspace between the LCD display and the touchscreen overlay can fog up from condensation and make the display unreadable and imprecise to operate, they optically bonded layers to eliminate air spaces and used a heater to eliminate internal display fogging.

    Since battery capacity tanks in very low temperatures and standard batteries can get damaged, they used special low temperature batteries with higher capacities and minimized performance loss at low temperatures.

    And to make sure this all worked transparently and without needing any worker involvement, they included environmental sensors and heater logic circuitry so that the device automatically handles the rapidly changing temperatures and humidity. There are, however, also ways to do it manually.

    Finally, since it makes no sense to overbuild, they offer two versions. One is called "Chiller" and is considered "condensation-resistant," with an operating temperature range of -4 to 122 degrees Fahrenheit. The other is called "Arctic" and is considered "condensation-free." That one can handle -22 to 122 degrees. The Chiller and Arctic versions add US$700 and US$1,100, respectively, to the cost of the basic Omnii XT15 handheld computer. If it means fewer equipment hassles when getting in and out of freezers, that's a small price to pay.

    There's another interesting aspect to all this. Changing and upgrading existing equipment is never easy, but in this case it was made easier because Psion, even prior to its acquisition by Motorola Solutions, had given much thought to modular design as a means to quickly and easily adapt to special requirements, easier maintenance, and also to future-proofing. At the very least this means much of the repairs and maintenance can be done in the field. And I wouldn't be surprised if it also made it easier to come up with these special versions.

    Posted by conradb212 at 11:19 PM

    May 14, 2013

    Handheld: Pursuit of a vision

    I had a chance yesterday to meet over dinner with Sofia Löfblad, Marketing Director at Handheld Group AB, and Amy Urban who is the Director of Marketing at Handheld US. I hadn't seen them since I presented at the Handheld Business Partner Conference in Stockholm three years ago, and it was a pleasure catching up in person.

    The Handheld Group (not to be confused with Hand Held Products, which is now part of Honeywell) is a remarkable rugged mobile computing success story. Having its origins as a distributor of vertical market mobile computers from the likes of Husky, TDS and others, Handheld went on to establish its own identity with its own distinct product lines. In fact, they call themselves a "virtual manufacturer."

    What does that mean? Well, while it is not unusual for larger distributors to resell OEM products under their own name, Handheld went one step farther. They not only have their own brands (Nautiz for rugged handhelds, Algiz for rugged tablets), but also their own design language and color scheme (Sofia even knew the precise Pantone color numbers), and they often have exclusive arrangements with their OEMs. So in addition to having a very cohesive brand identity and consistent look, Handheld products are less likely to immediately be identified by industry followers as rebranded versions of a common design.

    How has that worked out for the Handheld Group? Apparently quite well. They now have ten facilities all over the world, as well as several hundred authorized partners. And they've been able to score impressive wins like a contract for 10,000 rugged handhelds with Netherlands Railways against much larger competition.

    They also proved their knack for coming out with the right product at the right time with devices such as the Algiz 10X (a rugged but light and handy 10-inch tablet), the Algiz XRW (a super-compact rugged notebook), and the Nautiz X1, which they call the toughest smartphone in the world. On the surface, that doesn't sound all that terribly exciting, but it really is, and here's why:

    I am on record as bemoaning the demise of the netbook, those small and handy notebooks that used to sell by the tens of millions. Then they disappeared, due to a combination of being replaced by consumer tablets and, even more so, an unfortunate industry tendency to keep netbooks so stunted in their capabilities as to render them virtually useless for anything but the most basic tasks. Well, now that they are gone, many wish they could still get a small, competent Windows notebook that's tough and rugged, but isn't as big, expensive and bulky as a full-size rugged notebook. And that is the Algiz XRW. I've liked it ever since I took an early version on a marine expedition to the Socorro islands a couple of years ago (see Case Study: Computers in Diving and Marine Exploration). And the latest version is the best one yet (see here).

    The Algiz 10X likewise is a Q-ship (i.e. an innocuous looking object that packs an unexpected punch). On the surface, it's just a rugged legacy tablet, albeit a remarkably compact and lightweight version. And while that is mostly what it is, the 10X hits a sweet spot between old-style rugged tablet and new-style media tablet. One that will likely resonate with quite a few buyers who still need full Windows 7 and full ruggedness on a tablet and also some legacy ports, all the while also wanting a bright wide-format hi-res screen and a nice contemporary look.

    Then there's the Nautiz X1 rugged smartphone, and that's a real mindblower. By now there are quite a few attempts at providing consumer smartphone functionality in a tougher package, but none as small, sleek and elegant as the Nautiz X1. It measures 4.9 x 2.6 inches, which is exactly the size of the Samsung Galaxy S2 (the one before Samsung decided to make the displays almost as big as a tablet). At 0.6 inches it's thicker, and it weighs 6.3 ounces, but for that you get IP67 sealing (yes, totally waterproof), a ridiculously wide -4 to 140 degree operating temperature range, and all the MIL-STD-810G ruggedness specs you'd usually only get from something big and bulky. Which the Nautiz X1 definitely is not.

    In fact, with its gorgeous 4-inch 800 x 480 pixel procap screen, Android 4.x, and fully contemporary smartphone guts, this is the tough smartphone Lowe's should have looked at before they bought all those tens of thousands of iPhones (see here). Don't get me wrong—I adore the iPhone, but it's devices like the Handheld Nautiz X1 that belong in the hands of folks who use smartphones on the job all day long, and on jobs where they get dropped and rained on and so on.

    I don't know if Handheld is large enough to take full advantage of the remarkable products they have. They've done it before with that big contract in the Netherlands. But whatever may happen, it's hard not to be impressed with their fresh and competent products that go along with their great people, and their fresh and competently executed business plan.

    Posted by conradb212 at 3:59 PM

    April 24, 2013

    Itronix RIP

    Last week, as I came to a stop at a red light, a police car stopped in the lane next to me. What immediately caught my eye was an expertly mounted rugged notebook computer, angled towards the driver. It was a GD-Itronix rugged notebook, probably a GD6000 or GD8200, with an elegant matte-silver powder-coated insert on top of the magnesium alloy computer case that prominently featured the "General Dynamics" brand name. The officer perused the screen, then looked up, and briefly our eyes met. He had no idea how well I knew that computer in his car, and the one that came before it, and the one before that.

    I began following Itronix in the mid-1990s when their rugged notebooks still carried the X-C designation that stood for "Cross Country." Around that time, Itronix purchased British Husky and with that came the FEX21, and since Windows CE was starting to come on strong in smaller rugged devices, Itronix also introduced the tough little T5200 clamshell. I remember a call with Itronix in 1996 or so when I was watching my infant son in the office for an hour or two while his mom was shopping. The little guy was not happy and screamed his head off the entire time I was on the phone with Matt Gerber who told me not to worry as he had a couple of young kids himself. I remember hoping he didn't think we were running a monkey operation.

    Around the turn of the millennium, Itronix, in a clear challenge to Panasonic's rugged yet stylish Toughbooks, launched the GoBook. It was a clean, elegant, impressive machine with such cool features as a waterproof "NiteVue" keyboard with phosphorescent keys, and seamless, interference-shielded integration of a variety of radio options. I was impressed.

    That first GoBook would quickly evolve into larger, more powerful versions and then spawn a whole line of GoBook branded rugged notebooks, tablets and interesting new devices such as the GoBook MR-1 that measured just 6 x 4.5 inches, yet brought full Windows in a super-rugged 2.5-pound package to anyone who needed the whole Windows experience in such a small device. On the big boy side came the impressive GoBook II, then III, and then "Project Titan," the incomparable GoBook XR-1. At the time we said that it had "raised the bar for high performance rugged notebooks by a considerable margin. It has done so by offering a near perfect balance of performance, versatility, ruggedness and good industrial design." High praise indeed, and totally deserved.

    Itronix also branched out into the vehicle market with the semi-rugged GoBook VR-1 and into tablets with first the GoBook Tablet PC and then the GoBook Duo-Touch that combined both a touchscreen and an active digitizer into one small but rugged package. But even that wasn't all. With the introduction of the GoBook VR-2 came DynaVue, a truly superior new display technology that just blew my mind. Tim Hill and Marie Hartis had flown down from Spokane to demonstrate DynaVue on the new VR-2, and both could hardly contain their excitement. DynaVue ended up revolutionizing rugged systems display technology with a very clever combination of layering of filters and polarizers, and its approach became the basis of outdoor-viewable display technology still in use today.

    I'll never forget a factory tour of the Itronix main facility in Spokane, meeting and speaking with some of the most dedicated engineers, designers, product planners and marketing people in the industry. I visited their ruggedness testing (I always called it "torture testing") lab which rivaled what I had seen at Intermec and at Panasonic in Japan. I spoke with their service people, the folks on the shop floor and with management. What a talented and enthusiastic group of people. The sky seemed the limit. (See report of the 2006 Spokane factory tour)

    But change was brewing. Itronix's stellar performance had attracted suitors, and giant defense contractor General Dynamics, then a US$20 billion company with some 70,000 staff, felt Itronix would nicely complement and enhance its already massive roster of logistics, computing and military hardware. The sale had come as no surprise. Everyone knew it was eventually going to happen. Equity investment firm Golden Gate Capital had purchased Itronix in 2003 from former parent Acterna with the intent of prepping Itronix for a sale. Within just two years, Itronix prospered enough to make it a lucrative proposition for General Dynamics. Within Itronix, the hope was that the sheer mention of the name "General Dynamics" would open doors.

    In our GoBook VR-1 review we cautiously offered that "the change in ownership will be both a challenge and a tremendous opportunity for Itronix."

    Turns out we were right about the challenge part. The "GoBook" was quickly dropped in favor of a GDxxxx nomenclature, and with it the laboriously earned GoBook brand equity. There were attempts to somehow merge another GD acquisition, Tadpole, into Itronix, and that turned out to be difficult. No one seemed to know what to expect. And then the hammer fell.

    In early 2009, General Dynamics announced it would phase out the Itronix computer manufacturing and service facility in Spokane, Washington, and instead operate the business out of Sunrise, Florida, where the company's C4 Systems division had an engineering facility. It was a terrible blow for Spokane, where losing Itronix meant the loss of almost 400 jobs. And the cross-country move meant Itronix lost most of what had made it the vibrant company it had been.

    It was never the same after that. On the surface things continued to look good for a while. There seemed to be a cohesive product line with the GD2000, GD3000, GD4000, GD6000 and GD8000 rugged computing families. But from an editorial perspective, we were now dealing with people who didn't seem to know very much about the rugged computing business at all. And there no longer seemed to be a direction. Some of the final products were simply rebadged products from other companies. Eventually, there was mostly silence.

    In January 2013, I was told that "after in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products." In April 2013 came the end: "Itronix has phased out all products."

    That's very sad. A once great company gone. Could it have been different? Perhaps. But Itronix was often fighting against the odds. Even in its heydays, Itronix primarily worked with Taiwanese OEMs whereas its major competitors at Panasonic and Getac controlled their entire production process. In addition, while its location in Spokane was a calming constant, Itronix ownership was forever in flux. Itronix was started in 1989 as a unit of meter-reading company Itron to make rugged handheld computers. It was spun off from Itron in 1992, then sold to rugged computer maker Telxon in 1993. In 1997, telecom testing gear company Dynatech Corp. bought Itronix from Telxon for about $65 million. Dynatech changed its name to Acterna in 2000, but fell on hard times and sold Itronix to private equity firm Golden Gate Capital in 2003 for just US$40 million in cash. Golden Gate held on to it for a couple of years before General Dynamics came along. -- The band Jefferson Starship comes to mind here, with Grace Slick charging "Someone always playing corporation games; Who cares they're always changing corporation names..."

    Perhaps there could have been a management buyout. Perhaps the City of Spokane could have helped. But that didn't happen, and though in hindsight it seems like a natural, there are always reasons why things happen the way they happen.

    As is, there once was a superbly innovative company called Itronix, and they did good. I will miss them, and so probably will everyone interested in rugged computing equipment. I bet the police officer I saw with his Itronix laptop will, too.

    Posted by conradb212 at 6:52 PM

    March 27, 2013

    Xplore adds Common Access Card reader-equipped rugged tablet for military and government

    This morning, March 27, 2013, Xplore Technologies introduced a new version of their ultra-rugged tablet computer, the iX104C5-M2. In essence, this is a specialized model for military and government personnel who require additional hardware security on top of the various security hardware, software and firmware measures already inherent in modern computing technology. What the new M2 model adds is an integrated common access card (CAC) reader. With the reader, in order to get access to critical data, a U.S. government issued ISO 7816 smart card must be inserted.

    Why is the ability to read such cards and to provide data access only with such a card important? Because it's mandated in directives and policies such as the Homeland Security Presidential Directive 12 (HSPD-12) that requires that all federal executive departments and agencies use secure and reliable forms of identification for employees and contractors. A chip in the CAC contains personal data, such as fingerprint images, special IDs and digital certificates that allow access to certain federally controlled systems and locations. As a result, both Federal agencies and private enterprise are now implementing FIPS 201-compliant ID programs.

    But what exactly do all those card terms mean? What's, for example, the difference between a CAC and a PIV card, and how do they relate to Smart Cards in the first place? Well, the term "smart card" is generic. It's simply a card with a chip in it. The chip can then be used for data storage, access, or even application processing. A CAC is a specific type of smart card used by the US Department of Defense. A PIV (Personal Identification Verification) card is also a FIPS 201-compliant smart card used by the Federal government, but it's for civilian users. Then there's also a PIV-I smart card where the "I" stands for "Interoperable," and that one is for non-Federal users to access government systems.

    The way a CAC works, specifically, is that once it's been inserted into the CAC reader, a PIN must be entered and the reader then checks via network connection with a government certificate authority server, and then either grants or denies access. The CAC stays in the reader for the entire session. Once it's removed, the session (and access) ends.
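    The session flow described above — card inserted, PIN entered, certificate checked against a certificate authority, access lasting only while the card stays in the reader — can be sketched in a few lines of Python. This is just an illustration with hypothetical function and field names; real CAC middleware (PKCS#11 stacks talking to government certificate authority servers) is far more involved.

```python
def validate_with_ca(card, pin):
    """Stand-in for checking the card's PIN and digital certificate
    against a government certificate authority over the network."""
    return card.get("pin") == pin and not card.get("revoked", False)

def cac_session(reader, pin):
    """Grant access only if a valid CAC is present and the PIN checks out.
    Removing the card (reader["card"] becomes None) ends the session."""
    card = reader.get("card")
    if card is None:
        return "no card"
    if not validate_with_ca(card, pin):
        return "access denied"
    return "access granted"

# A valid card plus the right PIN grants access; a wrong PIN,
# a revoked card, or no card at all does not.
print(cac_session({"card": {"pin": "1234"}}, "1234"))  # access granted
print(cac_session({"card": {"pin": "1234"}}, "0000"))  # access denied
print(cac_session({"card": None}, "1234"))             # no card
```

The key point the sketch captures is that authentication is not a one-time event: access exists only for as long as the card remains physically in the reader.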

    What this means is that only computers that have a CAC reader can be used for certain military and other governmental work. And the new Xplore iX104C5-M2 provides that reader. It's built directly into the chassis where it is secured and protected.

    I had a chance to talk with Xplore Technologies representatives prior to the release of the new model. They said they created this new version specifically to meet the requirements of the Department of Defense, the Air Force, and Homeland Security. According to them, the potential market for CAC-equipped ruggedized tablets is 50,000-100,000 units. Considering that a rugged Xplore tablet lists for over US$5k, that would value the market at somewhere between a quarter billion and half a billion dollars. Xplore's competition, of course, will step up to bat as well, and not all CAC-equipped computers will require the superior ruggedness and portability of an Xplore tablet. But it's easy to see why Xplore, a company with roughly US$30 million in annual sales, would throw its hat in the ring. Even winning a small percentage of that market could have a sizable impact on Xplore.

    While I'm at it, let me recap what Xplore Technologies is all about and what they have to offer. Unlike the flood of Johnny-come-latelies attempting to grab a slice of the booming tablet market, Xplore has been making tablets for the better part of two decades. Starting in the mid-1990s, the company began offering some of the finest and most innovative rugged tablet computers available. They were at the forefront of integrating multiple wireless options into a rugged tablet, offering truly outdoor-readable displays, and providing dual mode digitizers that automatically switched between active pen and touch. We reviewed their current iX104C5 tablet platform in detail a couple of years ago (see here) and declared it "one of the best rugged tablet designs available today." In the meantime, Xplore has broadened the appeal of the platform with a number of versions targeted at specific industries and clienteles, and this latest M2 version continues that tradition with a very timely product.

    See the Xplore iX104C5-M2 product page.

    Posted by conradb212 at 6:58 PM

    March 6, 2013

    When the fire chief wants iPads instead of rugged gear

    The other day I was engaged in a conversation at a party. Turns out my conversation partner was the fire chief of an affluent community of about 120,000. We talked about our respective jobs and soon found we had something in common: rugged computing equipment. They use Panasonic Toughbooks, but the fire chief said something that has been on my mind for a while now. He said they liked the Toughbooks just fine, but that he considered them much too expensive, and that they'd just buy iPads instead. He said he doesn't care if the iPads break; they'll just replace them with new ones because consumer tablets cost so little.

    I can see that rationale. It's one thing if a professional tool costs 50% more than a consumer grade tool. But another if the professional tool costs five to ten times as much. Over the past few years I've seen large chains buy massive numbers of consumer smartphones and tablets instead of the rugged industrial-grade handhelds and tablets they used to buy. Sometimes it seems like the rugged computing industry is missing out on a great opportunity to benefit from the boom in smartphones and tablets by staying with older technologies and very high-end pricing instead of offering ruggedized versions of what today's consumers want.

    Posted by conradb212 at 3:57 PM

    February 7, 2013

    Not your father's Celeron

    In my last blog article I wrote about the needless demise of netbooks: people loved their rock-bottom price but found them too small and lacking in performance, so they asked for more size and performance. The industry complied with larger, more powerful netbooks, but those cost more, and netbooks weren't really netbooks anymore. So people stopped buying them. I also wrote how, in my opinion, Intel's inexpensive Atom processors both made the netbook, by making its low price possible, and then contributed to its demise with their often unacceptable performance. Unfortunately, that level of performance also hobbled a lot of other industrial and vertical market devices based on Atoms.

    So we have this unfortunate situation: Atom processors (of which there are by now about 50 different models) don't cost a lot, usually well under US$100, with some in the US$20 range. But they are also very marginal performers, so much so that not only netbook vendors abandoned them, but also a good number of vertical market manufacturers that quietly switched to "real" Intel Core processors. Unfortunately, even the low-end mobile Core i3 chips cost in the low US$200 range, and mobile Core i7 chips usually closer to US$400. Those are huge price differences with major impacts on low-cost consumer electronics (though one would think far less impact on much higher priced vertical market computers, where the processor makes up a much lower percentage of the overall purchase price).

    What that all means is that there's an unfortunate gap between the inexpensive but rather underpowered Atom chips, and the beefy but much more expensive Core processors. (Oh, and while I'm at it, here's basically the difference between the by now three generations of Intel Core chips: First gen: good performance, but power hogs with insufficient graphics. Second gen: good performers with much better gas mileage but still sluggish graphics. Third gen: good performance, economical, and much better graphics.) But now to get back to the big gap between Atoms and Core processors: there's actually something in-between: Celerons and Pentiums.

    Celerons and Pentiums? But weren't Pentiums old chips going back to the early 1990s that were then replaced by Core processors? And weren't Celerons bargain-basement versions of those old Pentiums? Yes, that's so, but there are still Celerons and Pentiums in Intel's lineup, and they are NOT your father's Celerons and Pentiums; they are now slightly detuned versions of Core processors. They should really call them Core i1 or some such.

    But let me explain, because those new-gen Celerons and Pentiums may well be one of the best-kept secrets in the processor world. If you go to the Intel website and look up their mobile processor lineups, you'll find them neatly organized by generation and then by Core Duo, Core 2 Duo, i3, i5, and i7. Celerons are still listed as either Celeron M Processors or Celeron Mobile Processors. The Celeron Ms are old hat and many go back a decade or more. The Celeron Mobile processors, however, include many models that are much newer, with the Celeron 10xx low and ultra-low voltage models launched in Q1 of 2013, i.e. as of this writing. I would have never noticed this, and probably would have continued thinking of Celerons as obsolete bargain processors, had it not been for an Acer mini notebook I just bought as a replacement for my vintage (2008) Acer Aspire One netbook.

    You see, my new Aspire One has an 11.6-inch 1366 x 768 pixel screen and is really still a netbook, with netbook looks and a netbook price (I bought it as a refurb for US$250), but it has a Celeron instead of an Atom processor. The 1.4GHz Celeron 877, to be exact, introduced in Q2 of 2012, an ultra-low voltage design with a thermal design power of 17 watts. It uses the Sandy Bridge architecture of the second gen Core processors and reportedly costs about US$70, no more than a higher end Atom chip and only about US$25 more than the Atom N2600. Now how would that work, a real Sandy Bridge, non-Atom chip in a netbook?

    Turns out very well.

    The Celeron-powered little Acer turned in a PassMark CPU score of 1,261, compared to the Atom-powered devices in our rather comprehensive benchmark database, which range from a low of 163 (Atom N270) to a high of 512 (Atom D510). The Celeron ran CrystalMark memory benchmarks between two and five times faster than the Atoms, and CrystalMark GDI benchmarks between three and five times faster. The Celeron 877 netbook also powered through most other benchmarks much faster than any Atom-based device. As a result, by netbook standards this new son-of-netbook absolutely flies. And it handles HD video, a big sore spot with early netbooks, without a problem.
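    To put those PassMark numbers in perspective, a quick back-of-envelope calculation turns the raw scores quoted above into speedup ratios (the scores are from our own testing; the little script is just an illustration):

```python
# PassMark CPU scores quoted above
celeron_877 = 1261
atom_scores = {"Atom N270": 163, "Atom D510": 512}

# How many times faster the Celeron 877 is than each Atom
for name, score in atom_scores.items():
    ratio = celeron_877 / score
    print(f"Celeron 877 vs {name}: {ratio:.1f}x")
# Celeron 877 vs Atom N270: 7.7x
# Celeron 877 vs Atom D510: 2.5x
```

In other words, even the fastest Atom in our database gets beaten by a factor of two and a half, and the N270 that powered most early netbooks by nearly a factor of eight.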

    But what about battery life and heat? After all, most of those mobile Atom chips have a minuscule thermal design power of between two and five watts (with the exception of the D510, which is at 13 watts), whereas, though designated an "ultra-low voltage" chip, the Celeron's TDP is 17 watts. Reviews on the web complain about insufficient battery life of this particular Acer Aspire One (the AO756-2888). Well, that's because to keep the price low, Acer gave this netbook only a wimpy 4-cell 37 watt-hour battery. Most earlier netbooks had beefier 6-cell batteries.

    In real life, our benchmark testing has always suggested that Atom power management was relatively poor, whereas ever since Sandy Bridge (second gen), Core processor power management has been excellent. So the difference between Atom and Core-based power consumption can be a lot less than one would assume based on TDP alone. And that was exactly what I found with the Celeron in this new Acer netbook. BatteryMon power draw (with WiFi on) was as little as 6 watts. That's actually LESS than what we have observed in a good number of Atom-powered devices (and also less than my old 2008 Atom N270-powered Acer netbook). Sure, the top end of the Celeron-based machine is so much higher that it can draw down the battery quicker than an Atom device, but under normal use, the Sandy Bridge guts of the Celeron handle power management very, very well. As for heat, my new little Acer has a quiet fan, but it actually stays cooler and the fan comes on less often than the one in my 2008 Atom-based netbook.
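    A back-of-envelope estimate shows why the small battery, not the Celeron, is what limits runtime. Dividing pack capacity by the measured draw (an idealized calculation that ignores load variation) gives roughly six hours on the 37 Wh 4-cell pack; a hypothetical 56 Wh 6-cell pack (an assumed capacity, purely for illustration) would push past nine:

```python
def runtime_hours(capacity_wh, draw_watts):
    """Idealized battery runtime: capacity divided by average power draw."""
    return capacity_wh / draw_watts

# 37 Wh 4-cell pack at the measured ~6 W draw
print(f"{runtime_hours(37, 6):.1f} hours")  # 6.2 hours

# hypothetical 56 Wh 6-cell pack, same draw (assumed figure)
print(f"{runtime_hours(56, 6):.1f} hours")  # 9.3 hours
```

So the reviewers' complaints trace back to Acer's battery choice, not to the efficiency of the Sandy Bridge silicon.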

    I am not an electrical engineer, so my conclusions about relative processor performance come from benchmarking, real life experience, and perusing Intel's tech specs. Based on that, I'd have to say the Pentium and Celeron versions of Intel's Core processors deserve a whole lot more attention in the mobile space. I don't know what it actually looks like at the chip level, but it feels like Intel starts with essentially one design, then adds features here and there (like all the extra Intel technologies in the Core i7 chips) and omits or perhaps disables them in lower-level chips. As a result, the inherent goodness and competence of an Intel Core chip generation may be available in those little-known Celeron and Pentium versions, if not all of the features of the more expensive SKUs.

    What does that all mean? Obviously, for those who need to run the latest and most 3D-intensive video game at insane frame rates, only the very best and most expensive will do. And the same goes for those who absolutely need one or more of those extra features and technologies baked in or enabled in i5 and i7 chips. If that is not an issue, the Celeron versions may just be a very well kept secret and a terrific bargain. The Celeron 877 sitting in my lowly new netbook absolutely runs rings around any Atom-based device, and it does so without even trying hard and while treating battery power as the precious commodity it is.

    So.... if I were a designer and manufacturer of vertical market industrial and rugged devices, I'd think long and hard before committing to yet another underpowered Atom chip that'll leave customers underwhelmed before long, and instead check what else Intel may have in its parts bin. There are some real bargains there, good ones.

    Posted by conradb212 at 4:02 PM

    February 4, 2013

    The needless demise of the netbook

    Three or so years ago, netbooks sold by the millions. Today, they're gone, replaced by tablets and larger, more powerful notebooks. What happened? I mean, it's not as if tens of millions of people wanted a netbook a few years ago, and today no one wants one.

    What's not to like about a small and handy notebook computer that runs full Windows and costs a whole lot less than even inexpensive larger notebooks? So much less that the purchase price of a netbook was close to making it an impulse buy.

    The problem was, of course, that while the price was right, netbooks themselves weren't. Windows running slowly on a very small display with marginal resolution quickly soured the netbook experience. The very term "netbook" implied quick and easy access to the web, an inexpensive way to be online anytime and anywhere. Well, netbooks were so underpowered as to make that browsing and online experience painful. It didn't have to be that way, but market realities painted the netbook into a corner where it withered and died.

    It's not that the technology wasn't there to make netbooks fast and satisfying enough to become a permanent addition to what consumers would want to buy. And it wasn't even that the technology required to make netbooks as powerful as they needed to be without disappointing customers would have been too expensive. It's just that making such products available would have cannibalized more profitable larger notebooks. And consumers who demanded larger, more powerful netbooks at the same low price also weren't thinking it through.

    There's a reason why compact technology demands a premium price. An unsubsidized 3-ounce smartphone costs as much as a 50-inch HD TV. A loaded Mini Cooper costs as much as a much larger SUV or truck. And ultra-mobile notebooks have always cost more than run-of-the-mill standard ones. It's the MacBook Air syndrome that dictates that sleek elegance and light weight costs extra. Netbooks broke that rule by promising the full Windows experience in an ultra-compact device at an ultra-low price.

    You can't do that in the Wintel world. Something had to give. And that was acceptable performance. I would not go as far as declaring Intel's entire Atom project as a frustrating, needless failure as there are many Atom-based products that work just fine. But the whole approach of making processors not as good and fast as they could be but throttled and limited enough so as not to interfere with sales of much more expensive processors is fundamentally flawed. It's like promising people an inexpensive car, but then they find out it can't drive uphill.

    So netbooks were flawed from the start in infuriating ways. The 1024 x 600 display format endlessly cut off the bottom of just about everything because just about everything is designed for at least a 1024 x 768 display. And that was the least of netbooks' annoying traits. Performance was the biggest problem. The Atom N270 processor in almost all early netbooks had painfully insufficient graphics performance, and was completely unable to play the HD video that people could generate on every cheap camera and phone. The endless wait for a netbook to complete any task beyond the very basics quickly turned people off. Yes, the small size and weight, the low cost, and the good battery life sold tens of millions of netbooks, but their inadequacy soon relegated them to the dustbin. In my case, I quickly realized that a netbook did not replace a larger notebook with standard performance; it just meant I had to take with me both the netbook AND the real computer.

    So people demanded more. The original netbooks had 7-inch screens, but that quickly grew to 8.9 inches for all those Acer Aspire Ones and Asus Eee PCs. And then that wasn't large enough and so the netbook vendors switched to 10.1 inch screens. And then to whatever new Atom processors Intel introduced. Then tablets came and it was just so much easier, quicker and more pleasant to use a tablet to browse the web that the netbooks' shortcomings became even more evident.

    With netbooks' fortunes waning but the iPad's tablet success turning out to be frustratingly difficult to copy, netbook vendors gave it one last shot. 11.6 inch screens with full 1366 x 768 720p resolution. AMD processors instead of Atom (short-lived and unsatisfactory). And finally ditching the Atom in favor of Intel Celeron and Pentium chips, which had little to do with the Celeron and Pentium M chips of yore but simply were wing-clipped versions of Intel's Core processors. By doing that, netbooks ceased to be netbooks. They had become smallish notebooks with decent performance, but without the endearing compactness, small weight and rock bottom prices that once had given netbooks such allure.

    And battery life suffered as well. Try as anyone might, it's just not possible to run an 11.6 inch screen and a 17-watt Celeron or Pentium for nearly as long on a battery charge as an 8.9-inch screen with a 2-watt Atom. So that quality of netbooks was gone, too.

    Where does that leave all those folks who wanted a cheap and simple little notebook for when space, cost and weight mattered? Nowhere, really. Tablets are wonderful and I wouldn't want to be without mine, but they are not full-function computers. Not as long as real productivity software isn't available for them, and not as long as tablet makers act as if something as simple and necessary as being able to do or look at two things at once were the second coming. Fewer dropped calls, anyone?

    So for now, if you peruse Best Buy or Costco or Fry's ads, you either get a tablet or a notebook with a 14-inch screen or larger, or you spring for an expensive MacBook Air or an Ultrabook.

    That leaves a big void, and a bad taste in the mouth. For the fact is that there could be totally competent netbooks in the impulse buy price range if it weren't for the reality that Intel makes all those pricey Core processors that all by themselves can cost several times as much as a basic netbook. Which means the myth that you need a real Intel Core processor to run Windows and not just some wimpy ARM chip must be upheld. Personally, I do not believe that for a second, but the financial fortunes of two major technology companies (Microsoft and Intel) are built upon this mantra, and that won't change unless something gives.

    So what did I do when my little old 8.9-inch Acer Aspire One finally gave out? First despair because I couldn't find a contemporary replacement, then grudgingly accept the reality of the netbook's demise and buy a new Aspire One, one with an 11.6-inch 720p screen and a Celeron processor. I got a refurbished one from Acer because it was cheaper and came with Windows 7 instead of Windows 8. So there.

    But what if a low, low price is not the issue and you want something really rugged in the (former) netbook size and weight category? Then you get an Algiz XRW from the Handheld Group. It's small and light enough, runs forever on a charge thanks to using a little engine that for the most part can (the Atom N2600), and has a 720p screen good enough for real, contemporary work. And it's for all practical purposes indestructible.

    Posted by conradb212 at 7:30 PM

    January 14, 2013

    On the Microsoft front ...

    Well, on the Microsoft side of things, a couple of areas are becoming a bit clearer. Not much, but a bit.

    At the National Retail Federation (NRF) Annual Convention & Expo in New York, Microsoft issued a press release entitled "Microsoft Delivers Windows Embedded 8 Handheld for Enterprise Handheld Devices." That title is a bit misleading as those handhelds, prototypes of which were shown by Motorola Solutions, are not available yet, and Microsoft won't even release the Windows Embedded 8 Handheld SDK until later this year. However, after having stranded the vertical and industrial market with the by now very obsolete Windows Embedded Handheld (nee Windows Mobile 6.5) for a good couple of years, at least now it looks like Microsoft will offer a vertical market version of Windows Phone 8 for all those who want a handheld with a Microsoft OS on it instead of Android.

    There will, of course, not be an upgrade path from Windows Mobile/Embedded Handheld to Windows Embedded 8 Handheld, just as there wasn't one from Windows Mobile to Windows Phone 7, or from Windows Phone 7/7.5 to Windows Phone 8. Still, at least having the prospect of soon getting an up-to-date mini Windows OS that's reasonably compatible with Windows 8 itself should be a huge relief to all those rugged handheld manufacturers who've been under increasing pressure to offer Android-based devices. Then again, Microsoft once again pre-announcing a product whose SDK isn't even shipping yet will also further perpetuate the uncertain vertical market handheld OS status quo, and likely lead to more customers deciding to simply get readily available consumer smartphones instead of waiting for the vertical market smoke to clear.

    On the tablet side, we have the, by most accounts, less than stellar reception of Windows 8. Microsoft will likely correct the situation with Windows 8 over time, but as far as tablets go, it's pretty easy to draw some preliminary conclusions: Like, no matter how good the Windows Surface RT tablet hardware was/is, without being able to run what most people will consider "Windows" for many years to come, Windows RT is simply not going to fly. If the Metro interface were a runaway hit and there were tons of Metro apps, perhaps. But as is, anyone who needs to use any "legacy" Windows software is out of luck with Windows RT. So it's a Windows CE situation all over again: Windows RT must not be too powerful or else it'll eat into Windows 8 marketshare. And there can't be a perception that ARM-based tablets are capable of running "real" Windows, or else there'd be no reason to spend a lot more for an Intel-based tablet.

    Posted by conradb212 at 6:11 PM

    January 4, 2013

    Big changes at General Dynamics Itronix

    Eagle-eyed RuggedPCReview readers may have noticed something missing from the front page of our site: the General Dynamics Itronix logo in the site sponsor column. Yes, for the first time since the launch of RuggedPCReview, Itronix is not among our sponsors anymore. That's sad as Itronix was our first sponsor, and prior to that we had covered all those rugged Itronix GoBooks and other rugged mobile devices in Pen Computing Magazine since the mid-1990s.

    What happened? We're not sure, but an email exchange with Doug Petteway, General Dynamics C4 Systems director of product management and marketing, revealed that the company is "restructuring its portfolio of rugged products to focus more on high value targeted solutions rather than the mass commodity market" and that while they'll continue selling the GD6000, GD8000 and GD8200 rugged notebooks through early 2013, the entire rest of the lineup of Itronix rugged mobile computing products is discontinued.

    Petteway made the following statement:

    "At General Dynamics C4 Systems, we have a set of core capabilities that we are leveraging aggressively to expand and grow in key markets. To maximize our potential for success, we must continually assess and refine our portfolio, investing in critical gap-filling capabilities that enable us to deliver highly relevant “must-have” solutions while also phasing out offerings that are no longer in high demand, freeing up valuable investment resources.

    After in-depth market research and analysis, we have determined that it is in the best interests of our company, customers and partners to phase out a number of our General Dynamics Itronix rugged computing products. This decision may affect the solutions customers buy from us today. Please know that General Dynamics C4 Systems’ management team wants to assure you that our customer needs remain our first priority.

    As always, customer satisfaction is paramount and we will continue to ensure our customers receive the service and support in full accordance with our warranty commitments.

    We remain focused on being an industry leader with proven, high value communications, computing, security and mobility solutions for our customers.

    Additional announcements will be made in the near future."

    That doesn't sound very good, and not having all those rugged Itronix notebooks and tablets available anymore is a big loss. We wish Itronix all the best, whatever course General Dynamics has in mind for them.

    Posted by conradb212 at 12:06 AM

    November 30, 2012

    Surface with Windows 8 Pro pricing contemplations -- an opportunity for traditional vendors of rugged tablets?

    On November 29, 2012, Microsoft revealed, on its Official Microsoft Blog (see here), pricing for its Surface with Windows 8 Pro tablets. The 64GB version will cost US$899 and the 128GB version runs US$999. That includes a pen but neither the touch nor the type cover; those cost extra.

    So what do we make of that?

    Based on my experience with the Surface with Windows RT tablet, I have no doubt that the hardware will be excellent. With a weight of two pounds and a thickness of just over half an inch, the Pro tablet is a bit heavier and thicker than the RT tablet, but still light and slim by Windows tablet standards. The display measures the same 10.6 inches diagonally, but has full 1920 x 1080 pixel resolution compared to the 1366 x 768 pixels of the RT tablet. That's the difference between 1080p and 720p in HDTV speak. There's a USB 3.0 port and a mini DisplayPort jack. Under the hood sits a 3rd Gen Intel Core i5 processor as opposed to the nVidia Tegra 3 ARM chip in the RT model. And both RAM and storage are twice what the RT tablet has. All that certainly makes for an attractive tablet.

    What customers of a Surface with Windows 8 Pro get is a modern and rather high performance tablet that can be used with a pen or a mouse in desktop/legacy mode, and with touch in the new Metro mode with all the live tiles and all. You can use the pen in Metro mode, of course, but Metro wasn't designed for that. And you can use touch in legacy mode, but as 20 years of experience with Windows tablets has shown, legacy Windows does not work well with finger touch. Still, this will most likely be good hardware that makes full Windows available in a tablet, and also allows evaluating Metro in its native mode.

    But let's move on to the ever important price. And here Microsoft faced an unenviable task. Microsoft tablets had to be price-competitive with the iPad, and the Surface RT tablets are. Except that so far they have not been accepted as "real" Microsoft tablets because they cannot run legacy Windows software. The Windows 8 Pro tablets are real Windows tablets, but they now cost more than iPads. Sure, they have more memory and ports and a memory card slot and an Intel Core processor, but the perception will still be that they cost more than iPads and are thus expensive. That's somewhat unfair because the i5 processor in the Microsoft tablet alone costs more than most consumer Android tablets. But this is an era where you can get an impressive, powerful and full-featured notebook for 500 bucks or so, and a sleek Ultrabook for well under a grand. That makes the tablet look expensive.

    Price, in fact, has always been a weak spot with Windows-based tablets. Witness a bit of tablet history: the first pen tablets in the early 1990s cost almost $4,000. Even in an era where notebooks cost much more than what they cost today, that was too much, and it was one of the several reasons why early pen tablets failed in the consumer market. Tablets did remain available in vertical markets throughout the 90s, albeit usually at around $4,000.

    In 2001/2002 Microsoft tried again with their Tablet PC initiative. The goal there was to bring the tablet form factor, beloved by Bill Gates himself, to the business and consumer markets. The price was to be lower, and to make that possible Microsoft initially mandated the use of inexpensive Transmeta processors. When they turned out to be too slow to drive the Windows XP Tablet PC Edition at an acceptable clip, everyone turned to Intel and the average 2002-style Tablet PC ran around US$2,000. Which was still too expensive for the consumer market where customers could pick up a regular notebook for less.

    Unfortunately, while two grand was too steep for consumers, the side effect was that companies like Fujitsu, Toshiba, and everyone else who had been selling tablets in the 90s now had to offer theirs for half as much as well, losing whatever little profit came from tablet sales in the process. What's happening now is that the Surface for Windows 8 Pro again halves the price people expect to pay for a tablet. And again there may be a situation where the public considers Microsoft's own Windows 8 tablets as too expensive while the verticals have to lower their prices to stay competitive with Microsoft itself.

    And that won't be easy. Vertical market vendors have done a remarkable job in making business-class Windows 7 tablets available for starting at around US$1,000 over the past year or so. But those tablets were almost all based on Intel Atom processors which are far less powerful than what Microsoft now offers in their own Windows 8 Pro tablets. So we have a situation where Intel pushed inexpensive Atom processors to make inexpensive tablets possible, but Microsoft itself has now upped the ante for its licensees by offering much more hardware for less.

    Ouch.

    It's hard to see how this could possibly leave much room for the traditional makers of business-class Windows tablets. Unless, that is, they find a way to compellingly answer the one question we've been hearing ever more loudly over the past couple of years: "we need a tablet like the iPad, but it must run Windows and be a lot more rugged than an iPad." Well, there's the niche. Tablets that match the iPad's style and Microsoft's newly established hardware standard, but a whole lot tougher than either and equipped with whatever special needs business and industrial customers have.

    That ought to be possible. The traditional vertical market tablet makers and sellers already know their markets. And unlike the designers of consumer market tablets, they know how to seal and protect their hardware and make it survive in the field and on the job. What that means is that Microsoft's pricing for their Surface tablets may well be a glass half full for the rugged computing industry, and not one half empty.

    Anyone for a sleek yet armored ULV Core i5 or i7-powered, IP67-sealed tablet with a 1080p dual-mode and sunlight viewable procap/active pen input display, a 6-foot drop spec, dual cameras with a 4k documentation mode, 4G LTE, and integrated or modular scanner/RFID/MSR options?


    Posted by conradb212 at 8:31 PM

    November 21, 2012

    Windows RT: how suitable is it for vertical markets? (Part II)

    I had planned a quick follow-up on my first impressions of the Microsoft Surface RT tablet and Windows RT in general. But now it's almost a month later, so why the hesitation?

    It's not because of Microsoft's hardware. I am as impressed with the Surface RT tablet as I was when I first took it out of its box. It's a truly terrific device. If after a month of use about the only gripe is that you still can't easily find the on-off button, you know the hardware itself is good. So no issues there. It never gets hot or even warms up. Battery life is practically a non-issue, like on the iPad. It's plenty fast enough. Honestly, the argument that for real performance and real work you need an Intel processor is pretty thin. What it really feels like is that Microsoft is in the difficult spot of having to artificially hold ARM hardware back via Windows RT so that it won't compete too much with Intel hardware, but at the same time Microsoft doesn't want to come across as being uncompetitive on ARM platforms. Tough position to be in.

    And then there's the whole concept of Windows 8. I really did not want to get into a discussion of operating systems, but Microsoft makes it hard not to. Especially if you've been covering Microsoft's various mobile and pen/touch efforts over the years.

    One giant problem is that Microsoft still does not want to let go of the "Windows on every device" maxim. So Windows 8 is on the desktop, on notebooks, on tablets and on phones. Microsoft claims it's all the same Windows, though it's really quite unclear to most whether it actually is. So from a practical perspective, what exactly is the advantage of the tile-based "Metro" look on all those very different computing platforms when you really can't run the same software anyway? Yes, the fairly consistent look is probably good for brand identity (as if Microsoft needs more of that), but for users it is, at best, inconvenient to deal with this one-size-fits-all approach.

    And there are some other issues.

    For example, what's the deal with the "flatness" of everything in Windows 8 and RT? Not long ago everything had to be 3D and layered, and now everything has to be completely flat? There is simply no good argument for that. 3D components on a screen always help make things more manageable and more obvious (let alone better looking), so complete flatness for complete flatness' sake seems weak.

    Then there's the peculiarly low density of almost everything I've seen so far in Metro. Maybe that's just because Metro is only getting started, but between the Kansas-like flatness and very little on the screen, it feels strange and empty, and it means a lot of panning left and right.

    And by far the biggest beef: why try to shoehorn everything into one operating system? It is very abundantly clear that traditional Windows apps, the kind that hundreds of millions use every day, are simply not for touch operation and may never be. Just because it's simple to touch here and there and use touch to consume information on small media tablets doesn't mean touch is the way to go with the much more complex interactive software most people use for work. Pretty much all of the creative work I do, for example, requires the pinpoint accuracy of a mouse: editing, image processing in Photoshop, layout in Quark Xpress, etc., etc. I cannot see how that can be replaced by just tapping on a screen.

    So from that perspective, it does seem like Microsoft has simply done what the company has done every time in the past 20 years when new and disruptive technology came along -- it paid lip service by putting a fashionable layer on top of Windows. That's what happened with Windows for Pen Computing (1992), the Pen Services for Windows 95 and then 98, and the Windows XP Tablet PC Edition (2002). Only this time the disruptive technology (tablets) has found widespread enough acceptance to really get Microsoft's attention.

    And a couple of personal peeves in Windows RT:

    First, I find the live tiles annoying. All the constant motion on the screen is distracting, and in corporate environments it's bound to get people sidetracked into consuming information. Let me make the decision what I want to do next, rather than have a screen full of tiles vying for my attention like a wall of living pictures in a Harry Potter movie.

    Second, if Metro is indeed Microsoft's interface and operating environment of the future, does that mean we'll have come full circle from having just one app per screen to task switching to, finally, software that allowed as many windows as we wanted, just to get back to task-switching one-thing-at-a-time? That, given the right apps, may be good on small tablets, but it's definitely not the way I'd want to work on the desktop or even on a laptop.

    Oh, and a third... if Microsoft is concerned about being so far behind with available apps in its store, it really doesn't show. If they were concerned, why would the store be as ultra-low density as it is, with no way of quickly finding what you really want? The store interface seems minimal beyond a fault.

    But on to Windows RT and its suitability for vertical markets. That actually might work, although there are several big ifs.

    Windows RT for vertical markets: PRO

    Economical hardware -- Judging by the initial Surface RT tablet, ARM-based Windows RT-powered tablets could be a perfect solution for numerous vertical market deployments. They are light, simple, quick, don't heat up, get superior battery life, and cost less.

    No virus/malware -- Users don't have to worry about viruses and malware because a) the main focus of the bad guys will remain Windows 8 proper, and b) all software must come from the Microsoft app store. That could be a big argument for Windows RT.

    Device encryption -- There's device level encryption in Windows RT. That can be done in Windows 8 also (via BitLocker and other utilities), but in Windows RT it's in the OS itself.

    Custom stores -- From what I hear, vertical market vendors will be able to have their own showrooms in the Microsoft store that only users of that vendor's hardware can see. That would/will be a great benefit for both users and vendors.

    Microsoft Office -- Microsoft Office comes with Windows RT. I haven't done a feature-by-feature comparison with "real" Office, and there are those who say Office RT is a dumbed-down version of Office. All I can say is that Office RT will meet the needs of a whole lot of users. If it's dumbed down, it's infinitely less dumbed-down than Office on Windows CE and Windows Mobile was. There are, however, some licensing issues as, at least for now, Microsoft considers Office RT not for commercial use.

    Legacy and leverage -- Microsoft has always used the leverage argument ("your users and programmers already know Windows, and this will fit right in"), and Windows RT will probably benefit from that as well. It's curious how many of the age-old Windows utilities and apps actually run on Windows RT, and Windows RT will probably fit much more easily into a corporate Windows infrastructure than Android or iOS.


    Windows RT for vertical markets: CON

    Confusion -- You'll forever have to explain (and wonder) what exactly works and what doesn't work on Windows RT compared to Windows 8. Some may decide it's easier to just use Windows 8 instead.

    Still not pure tablet software -- Unlike with Android and the iPad, Windows RT users still have to fall back into desktop mode for Office and perhaps other functionality (settings, configurations, etc.) where touch just doesn't work well and you really need a mouse. You can use any USB mouse with Windows RT, but it's frustrating to never know if you need a mouse on your new tablet or not.

    Artificial limitations -- Since Windows RT is not to compete too much with the Wintel side of Windows 8, there are hardware and software limitations to deal with in Windows RT, whether they make sense or not. Users are the victims here.

    Vendor predicament -- How is a hardware vendor to make the call on Windows 8 versus Windows RT? Offer both? Make cheaper RT versions? That's exactly the kind of predicament vendors used to have with Windows versus Windows CE (CE lost).

    So for now, as far as the suitability of Windows RT for vertical markets goes, I'll have to give an "A" for current Windows RT tablet hardware. It's really excellent, and ARM-based hardware could really be a boon for integrators and vertical market vendors; a "B-" for Windows RT itself, because for now Metro is too limited to be of much use; and a "D" for clarity of concept as it's totally unclear where Microsoft is headed with RT.

    Posted by conradb212 at 5:30 PM

    October 27, 2012

    Windows RT: how suitable is it for vertical markets? (Part I)

    Though as of this writing, October 27, 2012, Windows 8 and RT were officially unveiled just a couple of days ago, reams have already been written on Windows 8 and, to a much lesser extent, Windows RT. We got our Surface RT tablet on October 26 with the intent of reporting on the Surface hardware and RT software in some detail. However, our emphasis will be on their suitability for vertical and industrial markets.

    So what about Windows RT? The general word on it has been that it's a special version of Windows 8 for devices with ARM processors. A special version that will not be able to run any legacy Windows software, one that does not offer users the legacy desktop to go about their Windows business, and one where you cannot install software other than by downloading it from the official Windows store. Engadget clearly stated in its review of Windows Surface: "Windows RT can't run legacy programs written for traditional, x86-based Windows systems."

    Is this all so?

    Yes, and perhaps no.

    So here's what we found so far on our Surface tablet.

    It comes with Microsoft Office 2013, and you run those versions of Word, Excel, PowerPoint and OneNote on the Windows RT desktop. We took screen shots of Word, Excel and PowerPoint, and here's what the apps look like (click on the pics for full-size versions):

    Note that Office RT isn't final yet. It'll be a free download when it is. From what I can tell (and I am not an Office expert), even what comes with Windows RT now is a full version of Office, and not some micro version like Windows CE/Mobile used to have. This is the real thing.

    Anyone who expected Office to be totally touch-optimized for Windows RT will be disappointed. It's not. You can use it with touch, but it can be a frustrating experience. And the touch keyboard doesn't help. Fortunately, you can simply plug in any old mouse or keyboard or mouse/keyboard combo and it works with Windows RT right off the bat.

    Below is a screen capture of a PowerPoint presentation (and yes, I picked the slide that shows Alan Kay predicting it all back in 1968, and the original IBM ThinkPad tablet from 1993, on purpose).

    If you take a closer look at our Word and Excel screen captures, you'll notice that not only are they in their own windows, we also have legacy Windows apps like Paint, Notepad, Calculator, the Math Input Panel, a system shell and the old performance monitor running. Interestingly, they do run (and many others, too, like Remote Desktop, Windows PowerShell, the whole Control Panel, etc.), and you can even pin them on the old Windows task bar. In fact, there's a lot of old legacy Windows stuff down in the basement of Windows RT. And much of it seems as functional as ever.

    I am not sure what to make of that. After all, Windows is not supposed to run on ARM, yet a good number of the old programs do run. There's probably a good explanation for that.

    Unfortunately, that doesn't mean you can simply install and run other old software. If you try, a message says you can only install software from the Windows store.

    So what's our preliminary impression of Windows RT on a Surface tablet? Quite positive. The 1.3GHz quad-core Nvidia Tegra 3 CPU has plenty enough power to make RT tablets perform well. The Nvidia setup doesn't need a fan and the tablet never even warms up, at all. And it seems to run almost ten hours on a charge.

    Check back for more commentary on the suitability of Windows RT hardware and software for vertical markets.

    Posted by conradb212 at 6:19 PM

    October 16, 2012

    Windows Surface tablets will be here shortly

    Now this should be interesting. On October 16, 2012, Microsoft announced more details on its upcoming Windows Surface tablets. And though labeled as a "pre-order" with limited quantities, customers could actually order the Windows Surface RT tablet of their choice from the Surface page on Microsoft's online store. For delivery on or before October 26th, i.e. within ten days.

    So the pricing of the Microsoft Windows RT tablets is no longer a secret. The basic 32GB tablet without a keyboard touch cover is US$499, the touch cover adds a hundred bucks, and the 64GB version with touch cover is US$699. That gets you a Microsoft-branded tablet that's as slender as the iPad, though it weighs a tiny bit more (1.5 vs 1.44 pounds). The Microsoft tablet looks wider because its 10.6-inch screen has a wide-format 16:9 aspect ratio compared to the iPad's 4:3.
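    The "looks wider" observation follows directly from the geometry: for a given diagonal, the aspect ratio fixes the width and height. A short Python sketch (my own illustration, not from the original post) recovers both dimensions via the Pythagorean theorem:

```python
import math

def screen_dims(diagonal_in, aspect_w, aspect_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    # The diagonal of one aspect-ratio "unit" is hypot(aspect_w, aspect_h);
    # scale it up to the actual diagonal.
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

surface_w, surface_h = screen_dims(10.6, 16, 9)  # Surface RT: 10.6-inch 16:9
ipad_w, ipad_h = screen_dims(9.7, 4, 3)          # iPad: 9.7-inch 4:3

print(f"Surface: {surface_w:.2f} x {surface_h:.2f} in")  # ~9.24 x 5.20 in
print(f"iPad:    {ipad_w:.2f} x {ipad_h:.2f} in")        # ~7.76 x 5.82 in
```

    Despite the similar diagonals, the 16:9 Surface panel comes out almost an inch and a half wider but over half an inch shorter than the iPad's 4:3 screen, which is why it reads as a wide-format device.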

    There's a standard USB port (which the iPad doesn't have) and a standard microSD card slot (which the iPad also doesn't have). There's a capacitive touch screen of course, and two 720p cameras, meaning the Surface tablet is for video and not so much for taking pictures (for that you'd want higher res). The 1366 x 768 pixel resolution is more than the original iPad and the iPad 2's 1024 x 768, and it's also what's called 720p in HDTV and video speak, so it should be good for video playback.

    All the expected sensors are there: ambient light, accelerometer, gyroscope and compass, meaning the Surface will be able to do the same tricks customers have come to expect from modern apps. And speaking of apps, the Surface RT tablet comes with Microsoft Office Home and Student 2013 RT (see here). It's not the final, final version, but it'll be a free update when that becomes available.

    There's WiFi and Bluetooth, but no mobile broadband, so these initial versions of Microsoft's RT Surface tablets will need to be within the reach of a WiFi access point to be online. The processor is of the Nvidia Tegra variety, i.e. the type that has been powering the majority of Android tablets out there.

    What's new and different is Windows RT, a version of Windows that runs on ARM processors and doesn't need the presumably more complex x86-based hardware required to run full Windows. What exactly that means remains to be seen. It's said that the Surface RT tablets are aimed at the consumer market, but the iPad was, too, and now it's used almost everywhere. How exactly will Windows RT work? How will it resonate with customers who have come to expect elegant, effortless simplicity from tablets? No one knows just yet.

    And how will it all relate to Surface tablets with full Windows 8, tablets that will, at least the Microsoft Surface versions, look very much like the Surface RT tablets, but have beefier hardware (anything from the latest Atom to third gen Core processors), higher resolution (1920 x 1080), and more storage? Will the two co-exist, with users selecting one or the other depending on their needs? The Windows 8 Pro versions will inevitably cost a good bit more, but how much more can a market bear where consumers have been spoiled with very inexpensive, very powerful notebook computers for years? Much will probably depend on how Windows 8 pans out.

    Finally, what will it all mean to vertical and industrial market tablets? Will there be rugged tablets running Windows RT? Or will the ever-important leverage factor dictate that most enterprise and industrial tablets remain x86-based and compatible with legacy Windows? No one knows.

    So for now I ordered a Surface RT tablet, just to see how it works and what it's all about.

    Posted by conradb212 at 11:28 PM

    October 2, 2012

    Motorola Solutions' acquisition of Psion: Good, bad, or ugly?

    Well, it's done. Psion is now part of Motorola Solutions. On October 12th, 2012, Ron Caines and Frederic Bismuth of Psion and Mark Moon of Motorola Solutions sent out the following note to their customers:

    Dear Psion Customer:

    We are writing to let you know that today Motorola Solutions completed the acquisition of Psion PLC.

    Motorola Solutions is a leading provider of mission-critical communication systems and a pioneer in enterprise mobility solutions. The company has always been focused first and foremost on how to best serve its customers and chose to acquire Psion because of its complementary enterprise mobile computing products and its talented people who understand this highly specialized business. We are excited about what this opportunity brings you as a valued Psion customer. Bringing the Psion family of products onboard allows Motorola Solutions to extend its portfolio and better serve customers by delivering solutions in expanded use cases, especially in warehousing, cold chain, ports, yards and specialized modular applications.

    Integration of the two companies has only just begun today. There will be no immediate changes to your account management, the partners that serve you or the products and services you receive from Psion. Customers who previously purchased or will purchase Psion products can be assured their products will be fully serviced and supported for the full duration of the contracts. All customer support numbers also remain the same.

    Furthermore, Motorola Solutions is committed to investing jointly around its and Psion's technical strengths and capabilities to deliver compelling solutions for the various applications and markets that both Motorola Solutions and Psion have served.

    Once we have worked through the details of the integration, we will share those plans with you. You can be assured that throughout this process we will remain focused on building on Psion's relationship with you and serving all of our customers.

    If you have any questions, please contact us or your Psion representative. Thank you for your continued loyalty and support.

    With many of the smaller, independent manufacturers of rugged computing equipment being swallowed up by larger companies, this was perhaps inevitable. To many rugged computing enthusiasts and insiders, also inevitable is the question "why?", as there is substantial product line overlap between the two companies. In an informal conversation, a Motorola source said the acquisition adds handheld products and vehicle-mount terminals that complement Motorola's offerings. The acquisition, the source said, also supports the company's international growth strategy by providing an attractive global installed base.

    That's certainly true, but if the history of such acquisitions has shown anything, it's that the latter reason matters more than the former. As is, purchased product lines almost inevitably get absorbed. They may live on for a while, but in the longer run it makes no sense to carry duplicate lines. That's too bad, as Psion was really on to something with its modular approach to rugged handheld computing platforms. What will become of the innovative ikôn, Neo, and Omnii? The tough WorkAbouts? The panels that still carry the old Teklogix DNA?

    So for now, we reflect on what was. Through Pen Computing and RuggedPCReview.com we covered Psion for a very long time. First those really terrific little clamshell handhelds that were better than anything based on Windows CE at the time, then the acquisition of Teklogix in 2000 (I was at the press conference in Chicago when it was announced), the Psion netbooks way before the world bought tens of millions of "netbooks," and always the rugged handhelds. We had a close relationship with Psion most of the time; at some point we even had a "Psion PSection" in Pen Computing Magazine (with some of the columns still online at pencomputing.com/Psion/psection.html).

    So here's hoping that Moto Solutions will aim for, and succeed in, creating the synergy that is always given as the reason for an acquisition. After all, Moto's own Symbol Technologies is well aware of the good (its own flourishing after being acquired by Moto), the bad (Intermec > Norand), and the ugly (Symbol > Telxon).

    Posted by conradb212 at 3:11 PM

    August 31, 2012

    "The Windows Marketplace for Mobile for windows mobile 6.x devices is closing"

    "The Windows Marketplace for Mobile for windows mobile 6.x devices is closing" -- that was the title of a March 8, 2012 entry at answers.microsoft.com. Among other things, it said, "Beginning May 9, 2012, the Windows Mobile 6.x Marketplace service will no longer be available. Starting on this date, you will no longer be able to browse, buy or download applications directly on your Windows Mobile 6.x phone using the Windows Mobile 6.x Marketplace application and service." It was signed by The Windows Phone Team (with "Ready for a new phone? Explore the latest Windows Phones -- now with over 60,000 applications and games available!" in their signature). The fact that the announcement was made by the Windows Phone team, whose job it is to replace Windows Mobile, and not by whoever within the Windows Embedded contingent is tasked with presiding over Windows Embedded Compact, speaks volumes.

    Good Grief.

    What was Microsoft thinking? The one saving grace of what's left of Windows Mobile or Windows Embedded Compact, or whatever it's called these days, was the Windows Marketplace from which you could download apps directly into the device. Whenever I got a new Windows Mobile device for testing, the first thing I always did was download a few essentials, such as Google Maps, Bing, Facebook, Handmark's ExpressNews, a couple of utilities and converters, etc. Now you can't even do that anymore.

    It's as if Microsoft (or whatever feuding faction within Microsoft presides over the demise of Windows Mobile these days) had dropped even the last ounce of pretense that they intend to maintain Windows Mobile as a viable contender to iOS and Android. Windows Mobile never was that, of course, but the nicely done Marketplace at least let long-suffering users personalize their devices to some extent. No more.

    That is truly regrettable. I don't think anyone ever loved Windows Mobile, but the fact is that even today, in 2012, the vast majority of industrial and vertical market mobile hardware still runs one version of Windows Mobile or another. By ditching the Marketplace, Microsoft has now made sure that Windows Mobile devices are truly usable only via 100% custom-designed software that mostly avoids the OS interface altogether.

    That is not a happy situation for all the rugged hardware vendors who have faithfully designed, manufactured and marketed innovative, reliable, high quality devices for all those years, and now are saddled with an ancient software platform that is neither supported properly by Microsoft, nor competitive against newer platforms, even those incompatible ones from Microsoft.

    Posted by conradb212 at 4:56 PM

    August 11, 2012

    Performing under pressure

    As I am writing this, the London Olympic games are coming to an end. What two weeks of intense competition proved again is that winning means meticulous preparation, at times a bit of luck, and always the ability to perform under pressure. The latter made me think because rugged computers are all about the ability of a piece of equipment to perform under pressure. Pressure as in heat, cold, dust, rain, sun, and whatever else may keep a system from running at peak efficiency.

    Ruggedness testing is designed to determine if systems hold up under pressure, but are the tests really meaningful? Many probably are. If, for example, a system is dropped a number of times from a certain height and still works afterwards, chances are it'll survive similar drops out there in the field. But are all tests as meaningful?

    A while ago a manufacturer of rugged computers challenged us to test computing performance not just in an office environment, but also over the entire listed operating temperature range. We did, and not surprisingly, the machinery supplied by that company passed with flying colors, i.e. it ran through the benchmarks as fast at freezing and near boiling temperatures as it did at the 72F we usually have in the test lab.

    But, as we subsequently found out, that seems to be the exception. We've been doing benchmark testing on some other rugged devices under thermal stress, and the results are reason for concern. If a rugged handheld, laptop or tablet is supposed to be used out in the field, it's reasonable to assume it'll be asked to perform at peak efficiency at temperatures one is likely to encounter outdoors or on the job. Depending on where you are, that might easily include temperatures well over 100 degrees. Such work may well include prolonged exposure to the sun, where a device may heat up well beyond ambient temperature. If it is 105 degrees outdoors, temperatures may easily reach 115 or 120 degrees or even higher if you set the device down somewhere, or if it's left in a car. So what happens to performance then? Can the device perform under pressure?

    Turns out, not all can.

    Running our standard benchmarks after leaving rugged systems out in the California summer sun showed performance drops of 50 to 80%. That's pretty serious. Is it acceptable for a piece of equipment that's supposed to be used outdoors to run at only half speed, or even a fraction of it? I'd say not. Think of the potential consequences. Tasks may take twice to several times as long, potentially affecting critical decisions.
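    To put numbers like these in perspective, here is a minimal Python sketch of how thermal benchmark results can be reduced to a single performance-retention figure. The scores and test conditions below are invented for illustration; they are not actual RuggedPCReview test data.

```python
# Hypothetical sketch: quantifying thermal performance retention.
# All scores and condition labels are illustrative assumptions.

def retention(baseline_score: float, stressed_score: float) -> float:
    """Fraction of room-temperature performance kept under thermal stress."""
    return stressed_score / baseline_score

# Made-up benchmark scores for one device under three conditions
runs = {
    "72F lab": 1000.0,
    "105F ambient": 620.0,
    "120F after sun exposure": 350.0,
}

baseline = runs["72F lab"]
for condition, score in runs.items():
    pct = retention(baseline, score) * 100
    print(f"{condition}: {pct:.0f}% of baseline performance")
```

    A device that retains only 35% of its baseline score after sun exposure is in the 50-to-80% drop range described above.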

    Is it reasonable to expect full performance under extreme conditions? Not necessarily. Extreme conditions can have an impact on electronics, and there may be justifiable, reasonable precautions to limit performance so as to safeguard the equipment and its life. But is it acceptable to see performance drop to a fraction at the limits of a listed operating temperature range? It's not. Customers should know what level of performance they can expect when the going gets tough.

    Like at the Olympics, performance under pressure separates the rugged system winners from the also-rans. This really needs to be addressed.

    And it's not a simple issue. Complex electronics such as processors have sophisticated internal power management. Boards have sensors that report temperatures to control mechanisms that then may throttle system performance. Firmware and the OS may also monitor environmental situations and then engage fans or throttle performance. The hardware itself may have inherent design limitations. Variables such as Glass Transition Temperature, or Tg, come into play. Tg is the temperature at which polymer materials go from a glassy state to a rubbery state. The types of capacitors used matters. Conformal coating can protect boards. HALT testing can predict real life reliability better than the simple mean time between component failures. And so on.
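    As a rough illustration of the control mechanism described above, here is a minimal Python sketch of a stepped thermal throttle policy, the kind a firmware control loop might apply when board sensors report rising temperatures. The thresholds and clock fractions are invented for illustration and do not reflect any particular device's firmware.

```python
# Hypothetical sketch of a stepped thermal throttle policy.
# Thresholds (deg C) and clock fractions are illustrative assumptions.

THROTTLE_STEPS = [
    (70, 1.00),   # full speed below 70 C
    (85, 0.75),   # mild throttle
    (95, 0.50),   # heavy throttle
]
MIN_FRACTION = 0.25  # emergency floor at or above the last threshold

def allowed_clock_fraction(temp_c: float) -> float:
    """Map a board temperature reading to the permitted fraction of peak clock."""
    for limit, fraction in THROTTLE_STEPS:
        if temp_c < limit:
            return fraction
    return MIN_FRACTION
```

    The design question the text raises is exactly where these steps sit: a policy that sheds a quarter of its clock at the edge of the rated temperature range keeps field work usable, while one that falls to the emergency floor delivers the fraction-of-baseline benchmark results described above.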

    All of this is standard practice in embedded systems design. It should be fully and universally adopted in rugged mobile system design as well.

    Posted by conradb212 at 4:46 PM

    June 26, 2012

    Microsoft's entry into tablet hardware a result of partner failure?

    Ever since Microsoft provided a glimpse at a couple of "Surface" tablet hardware prototypes, some in the media are describing Microsoft's apparent entry into the hardware market as a result of Microsoft hardware partner failure. As if, somehow, the combined might of the world's computer manufacturers failed to come up with tablet hardware good enough to do Windows justice.

    Nothing could be farther from the truth.

    The reason why Windows-based tablets never were a major commercial success lies squarely in Microsoft's corner, and not in that of the hardware partners. To state the very obvious: Windows has never been a tablet operating system. It was designed for use with a keyboard and a mouse. It does not work well with touch, and it did not work well with pens.

    If anything, hardware partners went out of their way with innovative ideas and products to make Windows work in Microsoft-mandated tablets. And let's not forget that it was Microsoft itself that, well into the lead-up to the 2002 Tablet PC introduction, began pushing convertible notebooks rather than tablets. Apparently, the company had so little faith in its own Tablet PC project that it seemed safer to introduce the Tablet PC Edition of Windows XP on a notebook with a digitizer screen rather than a true tablet. That, of course, made tablet PCs bigger, bulkier and more expensive.

    Let's also not forget that Microsoft mandated an active digitizer for the 2002 Tablet PC because active pens better emulated the way a mouse (and with it, Windows) worked. Touch was definitely not part of the Tablet PC.

    Microsoft's hardware partners did the absolute best they could within the great constraints of the Windows OS. In the 1990s, companies like GRiD, Fujitsu, Toshiba, NEC, IBM, Samsung, Compaq and many others came up with numerous tablet computer solutions trying to somehow make Windows work in smaller, lighter, handier platforms without physical keyboards. In the 2000s, a whole roster of hardware partners came up with tablet and tablet convertible hardware when Bill Gates proclaimed that by 2006, tablets would be the most popular form of PCs in America. They (Motion Computing, Fujitsu, Acer, Toshiba, Panasonic, etc.) invested the money and they carried the risk, not Microsoft.

    Add to that the unsung heroes of the tablet computer form factors, the companies that made all those vertical market tablets for applications where it simply wasn't feasible to carry around a big laptop. They made do with what they had on the operating system side. And they did a remarkable job.

    To now complain about "partner failures" is simply asinine. And given that even now, hardware partners will have to decide whether to bet on x86 Windows 8 or ARM Windows RT, will they again be blamed if one or both flavors of Windows 8 fail to make inroads against the iPad and Android tablets?

    Posted by conradb212 at 10:47 PM

    June 21, 2012

    Windows Phone 8...

    Sometimes I wish I could be a fly on the wall to listen in when Microsoft's mobile folks make their decisions.

    I mean, a few years ago they found themselves in a position where, against all odds, their erstwhile omnipotent foe Palm collapsed and left Windows Mobile as the heir apparent. So did Microsoft take advantage of that? Nope. Instead, they failed to improve their mobile OS in any meaningful way, all the while confusing customers by endlessly renaming the thing. And handing leadership over to the phone companies.

    Then Apple comes along and shows the world how smartphones are supposed to be. Well, apart from grafting a Zune-like home screen, Microsoft did virtually nothing to advance Windows CE from its mid-1990s roots. Then they come up with Windows Phone 7, which is a whole lot better, but completely incompatible with any earlier Windows CE/Windows Mobile devices and software.

    While Phone 7 and the Phone 7.5 update were billed as the future, apparently they weren't, as now there will be Windows Phone 8, which is.... completely incompatible with Phone 7/7.5. And why? Because Phone 8 will supposedly share the same Windows kernel that "real" Windows has (though presumably not the ARM versions). So if Windows Phone 7/7.5 still had Windows CE underpinnings, why were those versions not compatible at all with earlier Windows CE/Windows Mobile versions? It's just all so confusing.

    And about the shared Windows kernel: Wasn't the very idea of Windows everywhere why Windows failed in so many areas that were not desktop or laptop?

    In this industry, one absolutely never knows what's going to happen. Palm was considered invincible, Transmeta was supposed to succeed, Linux was to be the next big thing, the iPhone and then iPad were widely derided as lacking and a fad when they were first introduced, and Android was certain to quickly challenge iOS in tablets. So perhaps Windows Phone 8 will somehow become a success, but then why baffle the public with Windows 8 for the desktop, Windows RT, which isn't quite Windows, for ARM tablets, two versions of "Surface" tablets, and then Windows Phone 8 devices that share the Windows kernel but are somehow separate anyway?

    Go figure.

    Posted by conradb212 at 8:19 PM

    May 30, 2012

    Android finally getting traction in vertical and industrial markets?

    Just when Windows 8 is looming ever larger as perhaps a credible competitor to iOS and the iPad, we're finally starting to see some Android action in vertical market tablets and handhelds. It's timid, exploratory action still, but nonetheless a sign that the industry may finally break out of the stunned disbelief as Apple was first selling millions and then tens of millions of iPads.

    What has changed? Perhaps it's the fact that it's becoming increasingly hard to argue against Android as a serious platform now that Google's OS dominates the smartphone market. Though it seems more fragmented than ever, Android is now on hundreds of millions of smartphones, and all of them are little mobile computers much more than phones. The fragmentation is certainly an issue, as is the large variety of mobile hardware Android runs on, but it's also a trend and a sign of the times. Cisco recently published the results of a study which showed that 95% of the surveyed organizations allowed employee-owned devices, and more than a third provided full support for them. It's called the "Bring Your Own Device" syndrome, and for Cisco it was enough to ditch its own Cius tablet hardware. What it all means is that people will want to use what they own, know and like, and in tablets and handhelds that's iOS and Android.

    There's also been movement on the legal front. Oracle had been suing Google for patent infringement over some aspects of Android, and since Oracle is a tenacious, formidable opponent in whatever they tackle, this cast a large shadow over Android. Well, Google won, for now at least, when a jury decided Google had not infringed on Oracle's patents.

    So what are we seeing on the Android front?

    Well, there's DRS Tactical Systems that just announced two new rugged tablets with 7-inch capacitive touch displays. They look almost identical, but they are, in fact, two very different devices. One runs Android, one Windows, and DRS made sure the hardware was fully optimized for each OS, with different processors, different storage and different controls. That's costly, and it shows that DRS sees Android as having just as much of a chance to be the platform of choice in mobile enterprise applications as does Windows.

    There's Juniper Systems which revealed that its unique 5.7-inch Mesa Rugged Notepad will soon be available in an Android version called the RAMPAGE 6, courtesy of a partnership with Pennsylvania-based SDG Systems. The Juniper Mesa is powered by the ubiquitous Marvell PXA320 processor. If the Android version uses this same chip, we'd finally have an answer to the question whether the PXA processors that have been driving Pocket PCs and numerous industrial handhelds for a decade can run Android (we asked Marvell several times, to no avail).

    The folks at ADLINK in Taiwan have been offering their TIOT handheld computer in two versions since late 2011; the TIOT 2000 runs Android, the identical-looking TIOT 9000 Windows CE. Here, though, the Android model runs on a Qualcomm processor whereas the Windows CE model has a Marvell PXA310.

    General Dynamics Itronix has been playing with Android for a couple of years now, demonstrating their Android-based GD300 wearable computer to military and other customers. Panasonic introduced their Toughpad to great fanfare at Dallas Cowboy Stadium in November of 2011, but though the rather impressive tablet seemed ready back then, it actually won't start shipping until summer of 2012. Motorola Solutions also announced an Android tablet late in 2011, but I am not sure if the ET1 Enterprise Tablet is in customer hands yet.

    Mobile computing industry veterans may recall that there was a similarly confusing era several technology lifetimes ago: back in the early 1990s the upstart PenPoint OS platform came on so strong that several major hardware companies, including IBM, shipped their tablets with PenPoint instead of Microsoft's unconvincing pen computing overlay for Windows. Microsoft, of course, eventually won that battle, but Microsoft's "win" also demoted tablets back into near irrelevance for another decade and a half. Will it be different this time around? No one knows. Microsoft dominates the desktop, as was the case back then. But unlike PenPoint which despite its hype was known only to a few, hundreds of millions are already familiar with Android.

    The next six months will be interesting.

    Posted by conradb212 at 10:10 PM

    May 2, 2012

    The widening gulf between consumer and vertical market handhelds

    Almost everyone has a smartphone these days. Smartphones are selling by the tens of millions every quarter. In Q1 of 2012, Apple and Samsung sold over 30 million smartphones each. Smartphones have become part of modern life. Everyone is tapping, pinching and zooming. Everyone except those who need a rugged smartphone. Because there isn't one.

    Now to be fair, there are rugged smartphones and any number of ruggedized handhelds that add phone functionality to a handheld computer that can also scan and do all the things people who work in the field need to do on the job. Except, they really aren't smartphones. Not in the way consumers have come to expect smartphones to be. Why is that?

    Because ever since 2007 when Apple introduced the iPhone, there's been a widening gulf between consumer phones and the devices people use at work. Before the iPhone, cellphones had a bit of rudimentary web functionality and a number of basic apps. Nothing was standardized and everyone rolled their own. Professional handhelds almost all ran Windows Mobile, which had had very good phone functionality as early as 2002. But Windows Mobile never really took off in the consumer market.

    Why did the iPhone change everything? Because it introduced a fluid, elegant way of using and interacting with the phone that resonated with people and made total sense. Almost no one wants to first pull out a plastic stylus to then operate a clumsy mini version of a desktop OS. But just lightly tapping at a screen, dragging things around, and effortlessly zooming in on what was too small on a tiny phone display, that's an entirely different story. One that Google quickly copied with Android, and one that Microsoft did not, or not until it was too late.

    As a result, smartphones took off on a massive scale, one much grander than anyone had anticipated. And it was the sheer, simple elegance and functionality of just having to lightly tap, swipe, pinch and zoom that did it. Which, in turn, came from Apple's primary stroke of genius, that of using capacitive multi touch.

    The rest is history. Since 2007, Apple's sold hundreds of millions of iPhones. And there are hundreds of millions of Android smartphones, with vendors selling Android-based smartphones combined having a larger market share than Apple.

    With all of this happening and perhaps half a billion handhelds being sold in just five short years, how did the vertical market respond? How did it benefit from the riches, the opportunities, the breakthrough in acceptance of handheld technology that the vertical market had been waiting for?

    It didn't.

    Ruggedized handhelds still run Windows Mobile in a form virtually unchanged from the days before Android and the iPhone. There is no multi-touch. There is no effortless tapping and panning and pinching and zooming. There is no apps store (there was one, but Microsoft closed it).

    And worse, there is no upgrade path. Windows Mobile, which Microsoft merged into its embedded systems group a while ago, seems frozen in time. But isn't there Windows Phone 7, now Phone 7.5, currently heavily promoted with the launch of the Nokia Lumia 900 smartphone? There is, but Windows Phone is totally different from Windows Mobile. There is no upgrade path. And even if there were, it's a market where there are already half a billion iPhones and Android smartphones, and people who know how to use them and who expect nothing less. Not in their personal lives, and not on the job.

    That is a definite problem for those in the market of making and selling ruggedized handhelds. And the problem is not demand. With the world now pretty much convinced that handheld computing and communication devices are tremendously useful and will only become more so, no one needs to be sold on the merits of handheld technology on the job. Everyone knows that already.

    The problem is that the business market now wants smartphones that are a little (or even a lot) tougher than a consumer phone, and perhaps can do a few things consumer phones don't do so well, like scanning. But the market wants that extra toughness and those extra capabilities without giving up the elegant, effortless user interface, the bright high-res displays, and the ability to take pictures and HD movies so good that consumer smartphones are now replacing dedicated digital cameras.

    And that's why it is becoming increasingly difficult to sell handhelds that offer technology and functionality that is by now very dated by consumer smartphone standards. Sure, the technology and functionality of most ruggedized handhelds are as good as or better than they were six years ago, but the world has changed. Sure, the vaunted Microsoft leverage argument ("You use Microsoft in your business, so Windows Mobile fits right in and you can leverage your existing investment") still applies. But that is no longer enough. Businesses that need to equip their workers with rugged handhelds now want more.

    But isn't the mere popularization of handheld technology enough to let rugged technology vendors make a good living? Perhaps. It all depends on the type of business and its inherent profitability. But is basically standing still a good business strategy in a technology boom measured in the hundreds of millions of consumer handhelds? And are the largely flat financials of rugged handheld makers not a warning sign?

    There are many possible scenarios. For example, perhaps we're seeing a total separation of consumer and vertical markets, one where consumer handhelds get ever more powerful while much more rugged vertical market computers pursue a small niche where they simply won't ever be challenged by consumer technology. And perhaps Microsoft will manage to somehow leverage a successful unified Windows 8 Metro-style user interface into handhelds that can become the true successor of Windows Mobile, with whatever benefits customers see in remaining within the Microsoft fold. And perhaps there really is an insurmountable challenge in making capacitive multi-touch suitable for rugged applications (this is often voiced as a reason, though I can't quite see it).

    But there are also darker scenarios that bode less well for the verticals. If consumer phones aren't tough enough or don't have certain peripherals, third parties may simply make rugged cases and enclosures to make them tough, and sleeves and caddies to add whatever functionality business customers want. Without losing the performance and capabilities of a consumer smartphone. In that case, what could and should have been a golden opportunity for vertical and industrial handheld makers might simply vanish as consumer technology eats their lunch.

    As is, it's become somewhat painful to see vertical market companies struggle, companies that know so well how to make products that hold up under trying circumstances, products that don't leak, products with displays that can be read in bright sunlight, products that will last years rather than months, and products that are tailor-made so well for very specific needs. Those companies have a lot of valuable expertise and so much going for them.

    But will all that be enough to mask and make up for an increasingly wider gulf between vertical market and consumer market technology? Only time can tell, and it may be running out.

    Posted by conradb212 at 4:59 PM

    April 24, 2012

    e-con Systems executive explains the reality of cameras in rugged computers

    A little while ago I had an email conversation with the folks at e-con Systems. They are an embedded product development partner with significant expertise in camera solutions in the Windows CE and Windows Embedded space. The company offers a variety of lens and camera modules that can be interfaced with most of the common handheld processors from TI, Marvell, Freescale and others. My interest was, as I discussed in earlier RuggedPCReview.com blog entries, why at a time when every new smartphone includes a superb camera capable of full HD 720p or 1080p video, the cameras built into rugged devices lag so far behind.

    Here is what Mr. Hari Shankkar, co-founder and VP of Business Development of e-con Systems had to say:

    "We have worked with several rugged handheld manufacturers and they use our device driver development services or our camera modules. Based on this experience and our interactions with them, here are our comments:

    • There is a big difference in the way rugged computers are constructed and the way devices such as digital cameras or smartphones are built.
    • The bulk of the NRE effort goes to making the device rugged and only a very small percentage is left when it comes to the camera. In the case of a digital camera or a cell phone this is not the case as the cameras are given higher importance.
    • These devices are sold through tenders and it is mostly B2B (business-to-business) and not B2C (business-to-consumer) like the cell phone cameras and the digital cameras. The request for quantities is low, like a few hundred per month or per quarter. We have personally not seen these tender documents but from what we have been told, the emphasis is given more to the ruggedness than to the camera side. The camera is needed but customers are more concerned about the resolution of the pictures and whether they can capture 1D/2D barcodes with it.
    • Some of the cameras with ISPs (image signal processors, for backend digital processing) don’t work at very low temperatures; only raw sensors work at such low temperatures. This means you have to have an external ISP on the board. But some of the manufacturers prefer to have the ISP developed in software and not have any hardware ISP. The digital cameras and the cell phone cameras have ISP integrated externally for high resolutions. This is one of the reasons you don’t see a rugged computer with an 8MP or a 14MP camera very often. Currently, the 8MP and the 14MP are raw sensors and no one has an ISP built in.
    • The image captured by the camera from a sensor can vary between the lens choices. A glass lens will give better quality than the plastic lens. However, we see most of the vendors going with camera modules having plastic lenses which of course affects the quality of the images you are capturing.
    • As long as the end customer demand is not that great for cameras, this will be like this. We see that integration of global shutter cameras (required for capturing stills when you are capturing a fast moving object) or integration of a glass lens not in the immediate future."

    So what Mr. Shankkar is saying is that a) rugged manufacturers concentrate on the basic design to the extent where the camera is usually an afterthought (and our internal examination of most rugged designs confirms that), that b) there are some image signal processing issues that complicate matters for rugged applications, and that c) in the absence of higher customer demand, the quality of imaging subsystems in rugged designs is going to remain as is.

    Those are certainly logical reasons, and as a provider of imaging solutions for handhelds and other devices, Mr. Shankkar is familiar with the thought process and priorities of rugged equipment vendors. And e-con Systems certainly has a roster of very competent camera modules (see e-con Systems camera modules).

    Nonetheless, I cannot help but see a widening disconnect between rugged computer manufacturers and the digital imaging industries here. Integrating the imaging quality and functionality of, say, a US$200 GoPro Hero 1080p hybrid video camera into a high-end rugged data capture device simply ought to be doable. And if I can take superb high definition pictures and 1080p HD video with a 5-ounce iPhone 4s, the same ought to be doable in a rugged handheld or tablet. Yes, it would add cost, but these are not inexpensive devices, and the precision data capture requirements of many vertical market applications deserve no less than what any smartphone camera can do.

    Posted by conradb212 at 10:50 PM

    April 18, 2012

    The nature and potential of Windows 8 for ARM devices

    Well, Microsoft announced in its Windows Blog (see here) that there will be three versions of the upcoming Windows 8. For PCs and tablets based on x86 processors, there will be plain Windows 8 and the more business-oriented Windows 8 Pro that adds features for encryption, virtualization, PC management and domain connectivity. Windows Media Center will be available as a "media pack" add-on to Windows 8 Pro. A third version, Windows RT, will be available pre-installed on ARM-based PCs and tablets. Windows RT will include touch-optimized desktop versions of Word, Excel, PowerPoint, and OneNote.

    That, mercifully, cuts down the available number of Windows 8 versions from five in Windows 7 (Starter, Home Basic, Home Premium, Professional, and Ultimate) to just three, if you don't count additional embedded and compact varieties.

    While Microsoft's April 16 announcement on the versions was interesting, what's even more interesting is a long entry in Microsoft's MSDN blog back on February 9. It was called "Building Windows for the ARM processor architecture" (see here) and provided an almost 9,000-word, fairly technical discussion of the ARM version of Windows 8. That one shed some light on how Microsoft intends to implement and position the next version of Windows, and make sure Windows won't be irrelevant in what many now term the "post PC" era.

    As you may recall, Microsoft's initial Windows 8 announcements were a bit odd. Microsoft called Windows 8 "touch first" and made it sound as if Windows 8 were a totally multi-touch centric OS. While that certainly sounded good in a world awash in iPads, it seemed exceedingly unlikely that all those hundreds of millions of office workers would suddenly switch to touch devices. One could really only come to one conclusion: Windows 8 would most likely work pretty much like Windows 7 and Windows XP before it, but hopefully also somehow incorporate touch into the vast Microsoft software empire.

    The MSDN blog goes a long way in explaining much of what we can expect. It's difficult to condense the very long post into some of the important basics, but it goes something like this:

    Windows on ARM, which was originally abbreviated WOA and then renamed Windows RT in the April announcement, should feel as much like standard Windows 8 as possible. To that extent, while the ARM version cannot run legacy Windows software, there will be a Windows desktop with the familiar look and feel, and also a lot of the familiar Windows desktop functionality.

    Microsoft also emphasized that Windows RT will have a "very high degree of commonality and very significant shared code with Windows 8." So why can't it run legacy Windows software? Because, Microsoft says, "if we enabled the broad porting of existing code we would fail to deliver on our commitment to longer battery life, predictable performance, and especially a reliable experience over time."

    That, however, doesn't mean there won't be Microsoft Office on the ARM version of Windows. In fact, every Windows ARM device will come with desktop versions of the new "Office 15," including Word, Excel, PowerPoint and OneNote. Will the ARM version of Office be different? Microsoft says that they "have been significantly architected for both touch and minimized power/resource consumption, while also being fully-featured for consumers and providing complete document compatibility." What that means remains to be seen. After all, the Windows CE/Mobile "Pocket" versions of the Office apps were also called Word, Excel, PowerPoint and OneNote, but offered just a small fraction of the desktop versions' functionality.

    From a cost point of view, x86 Microsoft Office runs from US$119 (Home and Student) to US$349 (Office Professional). Considering that Windows RT devices will likely have to be very price-competitive with iPads and Android tablets, including Office will put an additional cost burden on Windows ARM devices.

    Now let's take a broader look at Windows RT and how it'll differ from standard x86 Windows 8. First of all, you won't be able to just buy the Windows RT OS. It only comes already installed on hardware. That's really no different from Android, and the reason is that the operating system on ARM-based devices is much more intertwined with and optimized for particular hardware than x86 Windows, which pretty much ran on any x86 device.

    Microsoft also stated that it has been working with just three ARM hardware platform vendors, those being NVIDIA, Qualcomm and Texas Instruments. There are, of course, many more companies that make ARM-based chips and it remains to be seen whether other ARM vendors will remain excluded or if they, too, will have access to Windows RT. As is, while Windows has always been predominantly x86, Microsoft occasionally also supported other processor platforms. For example, early Windows CE was considered a multi-processor architecture. Back in 1997, Windows CE supported Hitachi's SuperH architecture, two MIPS variants, x86, the PowerPC, and also ARM.

    Another difference between the x86 and the ARM version of Windows 8 is that "WOA PCs will be serviced only through Windows or Microsoft Update, and consumer apps will only come from the Windows Store." So while x86 versions of Windows 8 application software will likely be available both through a Windows Store or directly from developers, Windows 8 ARM devices will follow the Apple app store model. That, of course, has significant control and security implications.

    A further difference between Windows 8 x86 and ARM devices will be that while conventional x86 hardware likely continues to have the traditional standby and hibernation modes, ARM-based Windows devices will work more like smartphones and tablets that are essentially always on.

    Now for the big question: How does Microsoft intend to bring Windows to such wildly different devices as a desktop PC and a tablet without falling into the same traps it fell into with earlier tablet efforts that were never more than compromises? In Microsoft's vision, by adding the WinRT, a Windows API that handles Metro style apps. From what I can tell, if a Metro application (i.e. one that only exists in the tile-based Metro interface) completely adheres to the WinRT API, then it can run both on ARM devices and also on x86 devices under their Metro interface.

    What does that mean for existing software that developers also want to make available on ARM devices? There are two options. First, developers could build a new Metro-style front end that communicates with external data sources through a web services API. Second, they could reuse whatever runtime code they can within a Metro environment. Either way, the old Windows leverage argument ("staff and developers already know Windows, so we should stay with Windows") won't be as strong, as the WinRT API and Metro interface are new. How that will affect business customers who simply wish to stay with Windows instead of using iPads or Android tablets is anyone's guess.

    I must admit that having gone through Windows for Pen Computing (1992), the Windows Pen Services (1996), and then the Windows XP Tablet PC Edition (2001), I am a bit skeptical of Microsoft's approach to Windows RT. It still feels a lot like hedging bets, cobbling yet another veneer on top of standard Windows, and claiming integration where none exists.

    In fairness, the iPad has the same issues with Mac OS. The iPad is fundamentally different from a desktop iMac or even MacBook, and I am witnessing Apple's attempts at bringing the Mac OS closer to iOS with a degree of trepidation. But the situation is different, too. Microsoft's home base is the desktop and it now wants (and needs) to find ways to extend its leadership into tablets and devices, whereas Apple found a new and wildly successful paradigm that flies on its own and only loosely interfaces with the desktop (where most iPad users have Windows machines).

    Bottom line? For now, while Windows 8 will undoubtedly do very well for Microsoft on the desktop and on laptops, it remains far from a certain slam dunk on the tablet and devices side. As I am writing this, Microsoft, AT&T and Nokia are on an all-out campaign to boost Windows Phone with the Nokia Lumia 900, but considering the massive head start the iPhone and Android have, nice though it is, Windows Phone remains a long shot. Windows RT will likely encounter a similar situation.

    One possible outcome may be that Windows RT will lead to a resurgence of the netbook syndrome. Netbooks sold on price alone, though they were never very good. Low-cost Metro devices might pick up where earlier gen netbooks left off, with multi-touch and lots of post-PC features, but still nominally being Microsoft-based and having Office.

    Posted by conradb212 at 5:06 PM

    April 16, 2012

    Will GPS drown in commercialism?

    There are few technologies that have changed our lives and work as fundamentally as GPS. Not so very long ago, if you needed to know where to go, you used a paper map. Today we simply punch in where we want to go, then listen to directions and monitor our position on the GPS display. And industry, of course, has taken wondrous advantage of GPS, using it to optimize and manage transportation and location-based services to a degree never thought possible. GPS, by any account, is totally crucial to our modern world and society.

    That's why a couple of recent observations worry me.

    The first was when I left for San Francisco International Airport for a recent trip to Europe and my Garmin GPS did not find San Francisco Airport. Flat out did not find it. Not even in the transportation category. What it did find, though, was a hotel close to the airport. And so, since I was already underway and needed to concentrate on traffic, that's what I had to choose as my destination. Which promptly meant that I missed an exit. I have to believe that a Garmin GPS ought to find San Francisco International Airport, but mine didn't. All it coughed up was a hotel nearby.

    After I returned from Europe, I needed to take my son to a local high school for a college orientation. I looked up the location of the college on Google Maps on my iMac and committed it to memory. In the car, I used the Maps app on my iPad, which is by Google, and the iPad drew the route from my home to the school. Except that it wasn't to the school. It was to a "sponsored location" nearby. Yes, the official Maps app on the iPad guided me to a "sponsored location" and not to where I wanted to go. Without telling me. It did place a small pin where I actually wanted to go, but the route it drew was to the sponsor location.

    That is a very dangerous trend. Project it into the future, and you might see a situation where GPS might be as utterly unreliable and frustrating as email is today. Just as we drown in commercial spam, what if GPS apps likewise will drown us in "sponsored locations," making users sift through commercial GPS spam in order to find what we really need? That would make GPS not only useless, but potentially dangerous.

    That, Google, would be evil indeed, and it's already evil that I am guided to a "sponsored location" instead of the clearly defined location I wanted to go to.

    How does that relate to rugged computing? It's pretty obvious. What if commercial hooks begin hijacking routes? What if even official addresses are drowned in sponsored spam locations? Think about it.

    And below you can see the routing to the sponsor location instead of the requested location marked by a pin (click on the image for a larger version).


    Posted by conradb212 at 3:56 PM

    March 8, 2012

    The new iPad -- both challenge and opportunity for rugged market manufacturers

    If you want to sell tablets it's tough not to be Apple. And on March 7, 2012, it got that much tougher. For that's when Apple introduced the next version of the iPad, setting the bar even higher for anyone else.

    Why do I even mention that here at RuggedPCReview.com where we concentrate on computing equipment that's tough and rugged and can get the job done where a consumer product like the iPad can't? Because, like it or not, the iPad, like the iPhone, sets consumer expectations on how computing ought to be done. It does that both by the elegance and brilliance of its execution, and by the sheer numbers of iPads and iPhones out there (Apple has sold 315 million iOS devices through 2011). That pretty much means anything that doesn't at least come close to offering the ease-of-use and functionality of the Apple devices will be considered lacking, making for a more difficult sell.

    Unfortunately for anyone else out there trying to sell tablets, it's been tough. Somehow, while the iPad is simply a tablet, a way of presenting, consuming and manipulating information, it's been remarkably difficult for anyone else to convince customers to select them, and not Apple. Remarkable because Apple, despite its mystique, never managed to even make a dent into Microsoft's PC hegemony, and remarkable because of the number of vocal Apple opponents who shred whatever Apple creates seemingly on principle.

    But let's take a quick look at Apple's latest version of the iPad, called not, as expected, iPad 3, but once again simply iPad.

    No one ever complained about the resolution of the iPad display (1024 x 768), and everyone else stayed around that resolution as well, with lower end products perhaps offering 800 x 480, many using the old 1024 x 600 "netbook" resolution, and higher end products going as far as 1280 x 800 or the wider 1366 x 768. Well, with the new iPad Apple quadrupled the pixel count to 2048 x 1536, making for a superior viewing experience. Such high resolution is not necessarily needed, but if it's available for as comparatively little as Apple charges for iPads, everything else now looks lacking. And I can definitely see how the super-high resolution could come in very handy for many vertical market applications.
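    The arithmetic behind "quadrupled" is worth spelling out: doubling each display dimension quadruples the total number of pixels. A quick back-of-the-envelope check, using only the display specs quoted above:

    ```python
    # Pixel-count comparison: original iPad display vs. the new iPad display.
    old_pixels = 1024 * 768    # original iPad: 786,432 pixels
    new_pixels = 2048 * 1536   # new iPad: 3,145,728 pixels

    # Doubling both width and height multiplies the pixel count by 2 x 2 = 4.
    print(new_pixels // old_pixels)  # prints 4
    ```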

    The new iPad also has two cameras. The new iPads we ordered will not arrive for another week and so I don't know yet just how good they are, but if the iPhone 4s is any indication, they will be very significantly better than what anyone else in the rugged arena has to offer at this point. I've long wondered why expensive, high quality rugged handhelds, tablets and notebooks come with marginally acceptable cameras, and the new iPads will only widen the chasm. The iPad cameras aren't only capable of offering fully functional video conferencing on their large screens, they can also snap rather high quality stills, and they can record 1080p full motion HD video, with image stabilization. And the iPad has the software to go with it. Few could claim this wouldn't come in handy for professionals in the field.

    Advances on the technology side include a faster dual core Apple-branded ARM processor with quad core graphics and 4G LTE wireless broadband. Unlike some rugged hardware we've seen over the years, iPads were never underpowered, and with the new chip they'll be snappier yet. And while 4G wireless isn't ubiquitous yet by any means, having it built-in certainly doesn't hurt. And then there's battery life, where the iPad, even the new improved one, wrings about ten hours out of just 25 watt-hours. And the whole thing still only weighs 1.4 pounds.
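    Those battery figures imply a remarkably low average power draw. A rough estimate (using only the two numbers above; actual draw obviously varies with workload and screen brightness) simply divides capacity by runtime:

    ```python
    # Implied average power draw: battery capacity divided by runtime.
    capacity_wh = 25.0   # battery capacity in watt-hours
    runtime_h = 10.0     # claimed runtime in hours

    avg_watts = capacity_wh / runtime_h
    print(avg_watts)  # prints 2.5 -- about 2.5 watts on average
    ```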

    Now, of course, the iPad isn't rugged. It's durable and well built, and if you use it in one of its many available cases, it won't get scratched or dented, but it's not rugged. Its projected capacitive multi-touch screen famously cannot be used with gloves, you can't use a pen when pin-point accuracy is required, and it's not waterproof.

    None of which stopped the iPad from scoring some remarkable design wins in areas and industries that once did not look beyond rugged equipment. The FAA granted American Airlines permission to use iPads to replace inflight manuals and such, and American is deploying 11,000 iPads. Others will follow.

    What does that all mean for companies that make rugged tablets? That the market is there. In fact, I believe the surface has barely been scratched. But it has to be the right product. Apple showed the way with the iPad but, with all due respect to those who've tried so far, few followed with more than a timid effort. It's been mostly wait-and-see, and now Apple has set the bar higher yet. That doesn't mean it's over for anyone else, but it's gotten tougher yet. The new iPad will boost acceptance of the tablet form factor and functionality to higher levels yet, and that still means opportunity for everyone else.

    I am convinced that there's a large and growing demand for a more rugged tablet, and that whoever comes out with a product that doesn't just approximate but match and exceed expectations will win big.


    Posted by conradb212 at 4:34 PM

    January 26, 2012

    A conversation on imaging in rugged handhelds

    Recently I received an email from someone in the industry that concluded with the question: "Wouldn't a conversation on imaging in rugged handhelds be interesting to your readers?"

    The answer, of course, is "definitely," and so I responded as follows:

    "I recently wrote two articles on the general state of imaging in handheld/mobile systems, so you basically know where I stand. In essence, given the very rapid advance in HD still/video imaging thanks to a convergence of CMOS, tiny storage formats, and H.264 compression technology (Ambarella!), it's now possible to generate excellent high resolution stills as well as near perfect 1080p/30 and better video in very small packages, packages that are small enough to fit into handheld and mobile computers.

    "Yet, while we see tiny $200 GoPros and such, and advanced still/video capability in virtually every smartphone, the imaging technology we find in almost all rugged computers, even high-end ones, is lacking. Though we review and examine numerous mobile computers every year, we have yet to find a single one that has hybrid imaging capabilities that come close to what is possible today, and most are, in fact, barely usable. It is inexplicable to me how a $4,000 ruggedized notebook computer or tablet does NOT include competent imaging subsystems. There is room, there is a need, and the costs are not prohibitive.

    "What enables me to make those statements? First, I have been reviewing rugged mobile computing technology for almost 20 years. For the past ten or 15 years, imaging in mobile computers has barely advanced. Second, I co-founded Digital Camera Magazine in 1997 (as the first magazine anywhere to concentrate solely on digital cameras). I continue to follow digital imaging closely and we also do digital imaging reviews as time allows. Third, as an enthusiastic scuba diver (see my scubadiverinfo.com), I have done many underwater imaging product reviews, including a couple on the GoPros (see here). Fourth, in working with several embedded systems vendors, I know what's possible in terms of integration. What I do see is an almost total lack of communication between computer and imaging people.

    "I was not familiar with your company, but I see that you are in part concentrating on camera modules. Which means that you are probably painfully aware of the situation. What must happen is much better integration of much better imaging capabilities into mobile computers. At a time where I can produce near Avatar-quality underwater 1080p 3D video with two GoPros, and where world events are routinely reported on smartphones, mobile computers are woefully out of touch with imaging. A professional who pays $4,000 for a rugged computer (or even just $1,200 for a rugged handheld) should expect no less in terms of imaging quality and ease-of-use than you can get in a cheap digital camera (i.e. sharp pictures, a decent interface, HD video, and speed). Instead, what we currently have in most mobile computers is simply not nearly good enough. You could never rely on it even for quick, reliable snapshots in the field, let alone quality imaging.

    "Think about it: businesses spend a lot of money to equip their personnel with expensive mobile computing equipment. Much of that equipment is used for data capture, sight survey, recording, reporting, etc. It makes zero sense to me to have vast computing power, a great outdoor viewable display, great communication and data capture technology, .... and weak rudimentary imaging that is in no way suitable or sufficient."

    Posted by conradb212 at 9:02 PM

    November 21, 2011

    Ruggedized Android devices -- status and outlook

    As far as operating system platforms go, the rugged mobile computing industry is in a bit of a holding pattern these days. Thanks to the massive success of the iPhone and iPad there is a big opportunity for more durable handhelds and tablets that can handle a drop and a bit of rain, yet are as handy and easy to use as an iPhone or iPad-style media tablet. On the tablet side, a lot of enterprises like the iPad form factor and ease of use, but they need something a bit tougher and more sturdy than an iPad or a similar consumer product. On the smartphone side, hundreds of millions use them now and expect the same elegance and functionality in the handhelds they use on the job. But again, those professional handhelds need to hold up to abuse and accidents better than your standard consumer smartphone.

    So with dozens and perhaps hundreds of millions of Android smartphones sold, and tens of millions of iPads, why are the likes of Lowe's home improvement center equipping their employees with tens of thousands of iPhones instead of presumably more suitable ruggedized handhelds (see Bloomberg article)? And why do we see iPads being sold into enterprise deployments that used to be the exclusive province of rugged tablets? There isn't one easy answer.

    On the tablet side, it almost looks like the enterprise seems to want iPads and nothing else. Which is a problem for anyone who isn't Apple, as iOS is proprietary and Android-based tablets simply haven't caught on yet. That may be due to the perception that Android is really a phone operating system, or potential customers are befuddled over the various versions of the Android OS.

    On the handheld side where Android has successfully established itself as the primary alternative to the iPhone, it would seem to be easy to offer Android-based ruggedized smartphones and handhelds. But there, too, the majority of recent product introductions still used the by now ancient Windows Mobile, an OS that looked and felt old nearly a decade ago.

    So what gives? A few things.

    With tablets, the almost shocking lack of success of Android and other alternate OS tablets has had a cold shower effect. If neither Motorola Mobility (Xoom) nor RIM (Playbook) nor Hewlett Packard (TouchPad, Slate 500) can do it, who can? And then there's Microsoft's promise to finally get it right on tablets with the upcoming Windows 8. That's far from certain, but in a generally conservative industry where almost everything is Microsoft, the usual Microsoft leverage/investment/integration arguments carry weight.

    With handhelds and smartphones, it's harder to understand because non-Microsoft platforms have traditionally been far more successful, and in the era of apps, software leverage hardly matters anymore. Perhaps it's Microsoft heavy-handedly forcing Android vendors into paying royalties to it rather than to Google. Perhaps it's some sort of fear of straying too far into uncharted waters. It's hard to say. Almost everyone I talk to in the industry admits, off the record, to keeping a very close eye on Android developments.

    So that all said, where do we stand with respect to Android-based products in the vertical/industrial markets where durability, ruggedness and return-on-investment and total-cost-of-ownership matter?

    In tablets, there have been two recent introductions. One is the Motorola Solutions ET1, a small 7-inch display ruggedized enterprise tablet. It's based on a TI OMAP4 processor and runs Android 2.3.4, i.e. one of the "non-tablet" versions. The ET1 was said to be available in Q4 of 2011. RuggedPCReview reported on the device here. The other notable introduction is the Panasonic Toughpad, introduced in November of 2011, but not available until the spring of 2012. The Panasonic Toughpad is a Marvell-powered device with a 10.1-inch screen and runs Android 3.2. Both devices seem to be what a lot of enterprise customers have been waiting for: more durable versions of consumer media tablets, fortified for enterprise use with beefed-up security, service and durability without sacrificing slenderness, low weight and ease-of-use.

    On the handheld side, we've also come across some potentially interesting products. The first is the ADLINK TIOT2000 (see our report), a conventional resistive touch handheld with a QVGA display. What's interesting here is that ADLINK offers a visually identical version, the TIOT9000 (see here) that runs Windows CE, with the Android version using a Qualcomm 7227T processor and the Windows CE version a Marvell PXA310. Winmate just introduced its E430T, an industrial PDA with a large 4.3-inch display that uses capacitive touch. This machine uses a Texas Instruments DM3730 processor and is said to be able to run Android 2.3 or Windows Mobile 6.5. I've also seen Android listed as an alternate OS on some of Advantech's embedded modules, including the TI OMAP 3530-based PCM-C3500 Series (see here).

    On the surface, it would seem to be almost a no-brainer to cash in on the great public interest in tablets/smartphones and the opportunity a new-era OS such as Android provides. But nothing is ever as easy as it seems.

    For example, there's a big difference between traditional rugged tablets that usually either have very precise digitizer pens or a resistive touch screen (or often both), and iPad class devices that use capacitive touch that lets you do all that tapping and panning and pinching, but generally doesn't work in the rain or under adverse conditions. The same issue exists on the handheld side where the traditional Windows Mobile is clearly designed for use with a passive stylus and cannot easily take advantage of capacitive multi-touch. That has, however, not stopped Casio from introducing the IT-300 that has a capacitive multi-touch display, yet runs Windows Embedded Handheld 6.5 (see our report).

    So it's all a bit of a mystery. The transition to new operating platforms is never easy and often traumatic, and there are good arguments for being cautious. For example, in addition to leverage, one of the big arguments for Windows CE/Windows Mobile has always been the wealth of existing software. True, but in a world of tens of thousands of often very slick and sophisticated iOS and Android apps, it's hard to believe developers wouldn't quickly come up with the appropriate versions and apps.

    With tablets, the situation must be quite frustrating for manufacturers of rugged mobile devices. They undoubtedly see a great opportunity to cash in on the tablet boom, but they are to a degree caught between needing to support the existing Windows XP/Windows 7 infrastructure and deciding what to move to next. Microsoft is cleverly dangling a (for them) no-lose carrot in the form of Windows 8's Metro interface where ARM-based devices would only run Metro and have no access to "classic" Windows whereas for X86-compatible devices, Metro would just be the front end. So there are three potential success strategies: Android, Metro-based ARM devices, and X86 tablets that run Metro and classic windows. No one can support all three.

    So for now, as far as rugged tablets and handhelds go, it's the best of times and it's the worst of times.

    Posted by conradb212 at 4:42 PM

    November 2, 2011

    Windows 8: a bit of fear, uncertainty and doubt

    In mid-September 2011, Microsoft showcased a preview of the next release of Windows at the BUILD developer conference. After reading up on it, I wrote the below in the days following the preview, but held off putting it in the RuggedPCReview blog until I had a bit more time to let it sink in and contemplate the likely impact on rugged mobile computing manufacturers and users. My thinking hasn't changed, so what follows is pretty much my first impressions.

    Essentially, Microsoft is offering a touch-optimized front end on the next version of Windows. For ARM devices, the new front end is mandatory, for X86 devices it is not. That's probably so Microsoft doesn't expose itself to charges that classic Windows just doesn't work very well on ARM devices.

    What's a bit puzzling is that Microsoft called Windows 8 "touch-first." I have to assume that refers to the Metro interface only because having all of Windows touch-first would make most existing hardware essentially obsolete, as touch is neither available nor feasible on most desktops and notebooks. If all of Windows 8 would be touch-first, how would people take to a user interface designed for touch when they are sitting in front of a desktop?

    So Microsoft is basically hedging its bets in the tablet space, just as it has before when rival platforms began getting too much attention. Witness...

    In 1991, Microsoft grafted pen extensions on top of Windows 3.1 and called it Windows for Pen Computing. It was a miserable flop, but created enough FUD to stall and kill rivaling efforts (remember that even the original ThinkPad ran PenPoint and every major computer company had a pen tablet).

    In 1995, Microsoft grafted the Pen Extensions onto Windows 95, but essentially left it up to hardware manufacturers to make them work and support them.

    In 2001, Microsoft grafted pen functionality onto Windows XP and called it the XP Tablet PC Edition, forcing most hardware manufacturers to create products for it.

    In 2009, Microsoft added a bit of touch functionality and made it available in Windows 7, proclaiming the OS -- successfully marketed as a rock solid new platform when to most users it really looked like Vista done right -- as touch enabled.

    In each case, Microsoft's effort created enough FUD to either derail efforts or at least drive OEMs to support them to some extent.

    Now there'll be Windows 8 and once again Microsoft is attempting to ward off a challenge and remain relevant by integrating rival technology with just enough independent thinking to declare it its own.

    So what is Microsoft doing? Think about it. Would Microsoft gamble its still commanding market position on suddenly converting everything to touch? When touch really only works on tablets? When almost all work is still done on desks sitting down? When billions use keyboards and mice? When even Apple is not suggesting touch is the be-all and end-all, and isn't making all of OS X and all Macs work with touch only? When Microsoft just managed to convince the public that Windows 7 is new and solid? When unpleasant memories of Vista still linger? When almost everyone still remembers New Coke? When the idea of having tiles that summarize info from other apps has been tried (in WinMo) years ago? When the last thing IT wants is everyone having Facebook and Twitter built right in?

    Let's be realistic here. What Microsoft is doing is nothing more than trying its Windows Everywhere approach one more time. By promising a new Windows that is so marvelous that nothing else is needed, not on tablets, not on the desktop. That hasn't worked in the past, and it will not work now. What Microsoft so far has shown is an updated version of Windows 7 with a new optional interface. The only new thing is that the interface will be mandatory on ARM-based devices. So that Microsoft won't get criticized again if the touch layer doesn't work well on tablets or just isn't enough to run Windows. This way Microsoft can always refer those who need "real" Windows to an X86 tablet and relegate or even abandon ARM devices should that not work out. If it does work out, great. If not, no big deal.

    Now let's look at tablets specifically. Microsoft's primary argument for Windows on tablets is the leverage, legacy and compatibility proposition that says that corporate IT runs on Microsoft, all the software and software tools are Microsoft, developers know Microsoft, and there are trillions of Microsoft apps. Therefore, Windows based tablets will fit right in. Even if they are a little hard to operate.

    Using the leverage argument, if Metro is indeed a mandatory new interface on ARM-based tablets, then out goes the legacy application argument for tablets. It'll have to be all new apps. And that transition will be as hard as or harder than what Windows Mobile users encountered when it was the end of the road for WinMo 6.5, and there was only the vague promise of an eventual move to a Phone 7 style system that was not backward compatible.

    So then why not just stay with X86 and the option to run Windows Classic where all the software is and will be? That is going to be the big question. Also, it's been suggested that since developing for both ARM and X86 requires using the Metro UI, that means Metro will be the preferred environment. Will that mean Windows 8 users have to go back and forth between environments? Will we see "compatibility boxes" again?

    There is, of course, always the chance that Microsoft will indeed be able to put forth a credible effort, just as it did with the Windows 7 follow-up to Vista. The Metro interface may just be so compelling that it can stem and turn the tide of what may, by the time of its introduction, be several hundred million iPads and Android tablets. A tall order indeed.

    So for now it's Microsoft generating a degree of fear, uncertainty and doubt among hardware manufacturers and corporate customers. It's a wise move that was to be expected. And in time-honored Microsoft fashion, it's also a riskless bet where Plan B (Windows classic) is the safe perpetuation of the status quo.

    What does it all mean to makers of mobile and rugged devices? It depends on how serious Microsoft is with the Metro UI and ARM hardware. At this point, mobile hardware either uses Windows Mobile or Windows Embedded Handheld, or it's using Windows XP or Windows 7 on Core- or Atom-powered devices. It's hard to see much of a future for Atom-powered hardware if ARM-based tablets and handhelds can run Metro faster with fewer resources. In fact, the only reason would be to be legacy compatible, and that is a rather major reason.

    The next issue is touch. It's hard to imagine a next gen Windows not supporting a multi-touch interface that uses projected capacitive technology. And that is precisely what the vertical market mobile computing industry currently says it doesn't want because capacitive touch can't handle rain, gloves, or other adverse conditions. And then there's the pen functionality for signature capture and such, or even handwriting recognition. How will pens work in a touch interface (remember, touch has never worked well in a pen interface)?

    For a bit of testing, we installed Windows 8 on an older HP 2710p convertible Tablet PC. The install was easy and pretty much everything worked. From a cold start to Metro takes just under a minute. The HP tablet doesn’t have touch, but the installer recognized the pen just fine. All the swiping has to be done by pen. Clicking on the Start menu brought up Metro with its flat tiles. It all can be made to work somehow, but at this point I think the real question is whether Android can establish itself on tablets or not before Microsoft is ready with Windows 8.

    Posted by conradb212 at 4:36 PM

    August 8, 2011

    Do you have "Grandpa Boxes" in your lineup?

    Unlike Garry Trudeau, whose "Doonesbury" strips can be personal and mean-spirited (remember his relentless, unfair mocking of the Apple Newton?), Scott Adams' "Dilbert" presents a lighthearted, humorous, yet keenly insightful commentary on the corporate and technical issues of the day.

    In a recent strip (August 3, 2011), Dilbert's working on his computer when a young colleague approaches and asks, "Are you getting a lot done on the Grandpa Box?" "The what?" Dilbert asks. "The people in my generation do our work on our phones and tablets," is the response. "I also have a laptop," Dilbert objects. "I'll text the nineties and let them know," the young gun says (see the strip here).

    This made me think. Is this really happening? Are we really seeing a shift from the computing tools as we know them to a new generation of devices that we didn't really think could do the serious jobs? While it seems almost unthinkable that a smartphone could replace a "real" computer, 30 years ago almost no one thought PCs could ever challenge mainframes or minicomputers, and yet PCs went on to revolutionize the world and do things no one ever thought they could.

    It also made me think of my own changing pattern of using computers. I use my own smartphone and tablet more and more, and my laptop less and less. I described the syndrome in a series of lengthy blog posts entitled "iPad on the Road". On my own latest intercontinental business trip, I didn't take along a laptop at all, just my smartphone and tablet.

    I also thought of a period in my life about three years ago when texting was my preferred means of communication, and how immersed in it I became. I got to a point where the shortcuts on the tiny keypad of my phone and its T9 predictive text entry became second nature and I could bang out messages while hardly looking at the keypad at all. I remember thinking that hundreds of millions of people, and perhaps billions, text every day. To them, T9 and similar text entry is second nature. And yet, makers of rugged tablet computers hardly ever include any of those text entry methods. I even suggested it to some, but there never was any follow-up.

    Can phones and tablets really do the job of computers as we know them? And is the young generation really doing its work on phones and tablets? I can see it to some extent as I am using Apple's Pages wordprocessor on my iPad, and also FTP, SSL, blog and remote login programs. And that's on top of what media tablets do best, like browsing, email, entertainment, research, etc. And on my most recent trip, Skype on my tablet actually replaced even my phone.

    Does all of that make conventional computers "Grandpa Boxes"? The way I see it now, yes and no. Just like PCs replaced some of the conventional computing of the day and added a huge amount of new and previously unimaginable ways of using computers in everyday life, smartphones and tablets will replace some of the things we're now doing on desktops and notebooks, and add a huge amount of new functionality that we never really thought of.

    This means we may be at the threshold of a new era with both challenges and opportunities. The challenge will be to figure out what all will inevitably be replaced by these emerging computing platforms. The opportunity will be to take advantage of the new platforms.

    For the mobile rugged computing industry this means thinking long and hard which of their products are "Grandpa Boxes" and which continue to fill a real, rational need. And also what part of the smartphone and media tablet revolution to embrace and employ for their own purposes.

    So far, the industry has been timid. There are a few ruggedized smartphones and a couple of "new style" tablets, but no one has really ventured much past the cozy confines of the Wintel world. And the new realm of apps has not yet been discovered by the verticals. What this means is that a giant opportunity remains unexplored, and there's also a danger of simply missing the boat by waiting too long, with new players coming in and taking over.

    That won't necessarily happen as there's much expertise in this industry, but what if suddenly there are apps that can handle business processes on inexpensive yet durable smartphones and tablets the way hundreds of millions already use their smartphones and tablets?

    Do you have Grandpa Boxes in your lineup? If so, does that make sense, or is it time to move on?

    Posted by conradb212 at 4:43 PM

    June 29, 2011

    "The Cloud"

    It's fashionable these days to say that something's "in the cloud." The cloud is in. Everyone's moving stuff to the cloud.

    Which is really annoying.

    "The Cloud," of course, isn't a cloud at all. In fact, it couldn't be farther from a cloud. It's the same old server farms somewhere in a warehouse. That's all. So why the sudden fixation with "the cloud"? Probably because centralized storage and applications can be huge business and because it presents an opportunity to regain control over users and their data, control that has largely been lost ever since the PC revolution took it away from centralized mainframes in the 1980s.

    But isn't it really great not to have to worry about where stuff is stored? And that it'll all be there for you when you need it, wherever that may be? In theory, yes. In practice, not so much. Because it may, or may not be there.

    I learned that lesson yet again when my Amazon account somehow got compromised a little while ago. For all practical purposes, Amazon is in "the cloud" as far as their customers are concerned. Customer data is there, wish lists, old transactions, and all the archived Kindle books. So when Amazon suddenly didn't accept my password anymore I tried to reset it three times, exhausting in the process the passwords I can easily remember.

    A call to Amazon yielded that the account had indeed been compromised, and I was guided through setting it up again. I wasn't told how and why the hacking might have happened, and moving my data was a manual process that had to be done by Amazon. But even Amazon, stunningly, was unable to move my Kindle book library. Instead, they said they'd send me a gift card so that I could purchase the books again. The card eventually arrived.

    Then I found that my Amazon affiliates account was also linked to my main Amazon account, and also no longer worked. Amazon once again changed my password and gave me instructions on how to regain access.

    Bottom line: if even Amazon (or Sony or the government, for that matter) cannot guarantee that your data is safe, or explain what happened when it's compromised, why should I trust "the cloud"? Companies come and go, and some who are now presenting "cloud" services will undoubtedly soon be gone. Others will, in the software industry's inimitable fashion, act as if their service was the only one that matters and make users jump through hoops. And it'll all add to the rapidly growing number of logins and setups and passwords that we are pretty much forced to entrust our lives and financials with.

    While experiences like what happened to my Amazon account are simply annoying and worrisome, what happens if and when it all comes crashing down? Or if you wake up one day with amnesia, or the cheat sheet with all your access data is lost? The cloud -- poof! -- will be gone, and with it all of our data. That alone is a darn good argument for local storage and backups. Having one's head in the cloud will almost inevitably turn out to be a bad thing.

    Posted by conradb212 at 6:50 PM

    May 24, 2011

    Another conversation with Paul Moore, Fujitsu's Senior Director of Product Development

    I don't often do phone interviews with product managers or PR people when a new product is announced. That's because, for the most part, whatever they can tell me I already know from the press materials. And what I really want to know they usually can't tell me because PR folks, by and large, need to stick to a script and company line. Which means I might as well save the time of a PR call to examine things myself, Google this and that, and then form my own opinion.

    That said, there are industry people I enjoy talking to on the phone. Paul Moore, Senior Director of Product Development at Fujitsu, is one of them. Conversations with Paul are always value-added because he not only knows his stuff, but he also has opinions, answers questions, and does not shy away from a good debate over an issue. Like all professionals in his position, Paul must present and defend the party line, but with him you always get a clear and definite position and explanation, and I respect and appreciate that. I may not always agree, and at times it must be hard for someone in his position to argue a point that seems, from my perspective, rather clear. But that's what a good PR person does, and Paul is among the best.

    The occasion of our conversation was the availability of Fujitsu's new Stylistic Q550 tablet, a "business class tablet" first introduced back in February (see my preview). The Q550 represents Fujitsu's initial effort to grab a slice of the tablet market popularized by the iPad, and expected to grow almost exponentially. So far that's turned out to be much more difficult than anyone expected, as Apple's product and pricing are very good, main contender Android just doesn't seem quite ready yet, and Microsoft doesn't have anything specifically for tablets.

    The overall situation is odd. Many millions love the iPad and its effortless elegance, but for certain markets the iPad is lacking. It's not particularly rugged. It's an Apple product in a still largely Windows world. And there's no pen for situations where a pen is needed (signatures, etc.).

    So Fujitsu comes out with the Stylistic Q550 with a nice 10.1-inch screen, and running regular Windows 7 on a 1.5GHz Atom Z670 processor, one of the newest ones. It has multi-touch like the iPad, but also a pen, thanks to N-trig's DuoSense technology. It also has an SD card slot, a Smart Card slot, a fingerprint reader, higher resolution than the iPad (1280 x 800), a brighter backlight, outdoor viewability, and optional Gobi 3000. And it starts at just US$729, which isn't much for a business class machine.

    Paul starts the conversation by reminding me that Fujitsu has some 20 years' worth of experience in the tablet market (true, they are the pioneers). That taught them a thing or two. Like that removable batteries are a must; businesses can't send in product just to replace a bad battery. Then there's all the security stuff corporations need, like biometrics, the TPM module, BitLocker encryption, and compatibility with all the other gear companies already have. And there's also an HDMI port for presentations, a handstrap, dual cams, and the Gobi 3000 module so you can use AT&T, Verizon or Sprint, or whatever you want. Business needs all that.

    And that is why when Fujitsu created a next-gen tablet for commercial markets, they based it on Windows 7. That was just a given. "For us, this is a market expander," Paul said, "not just another product."

    That makes sense, even though the market researchers at IHS iSuppli just predicted that iPad-style media tablets will outsell PC tablets by a factor of 10 to 1 through the next four years or so (see here). Paul doesn't debate that point. "Let's face it, Apple owns consumer," he says, "We've always been vertical. We concentrate on usability, screens, ports, security, compatibility, ..." and he adds a half dozen more items and features that separate glitzy consumer electronics from the tool-for-the-job professional stuff.

    Why not Android then? There's allure, and Fujitsu is rumored to introduce a smaller Android-based tablet. Paul quickly cuts to the core of that issue: "No one likes to pay for an OS," he says, and that's certainly an Android attraction. "But Android is basically a phone OS. There are security challenges, different marketplaces, and if all my software is Windows-based, do I really want an Android device?" Good points there, and especially when a business uses custom software. And as for the iPad, it's a "want" device, Paul says. Theirs is a "need" device. All net on that one.

    Then I press on an issue that I consider very relevant. While I have serious doubts that Windows, as is, is well suited for tablets, the compatibility argument is valid. I think Microsoft's leverage-across-all-platforms mantra is not as strong as it once was, but for now it still stands. However, if you make a business class machine, it really should be considerably tougher than a media tablet. Yet, the Q550 is listed with a rather narrow 41 to 95 degree Fahrenheit operating temperature range and nothing more. No drop spec, no sealing spec against dust and water, no altitude or humidity specs, nada. Why? Especially when Motion introduced the CL900, which does offer a decent degree of ruggedness.

    Paul says their tablet does not compete in the same class as Motion's. The Motion tablet is heavier and more expensive and really more in the class of an Xplore tablet or such. I cannot agree here. While the Q550 is indeed a bit lighter and less expensive than the Motion tablet, both are essentially Windows-based business class media tablets starting at under US$1,000 whereas fully rugged hardware like the Xplore tablets weighs and costs a whole lot more. I definitely believe commercial markets would like to see a degree of ruggedness, but Paul won't concede the point. Besides, they do have protective cases and such. And Paul's argument that Fujitsu has a long record of building tablets that hold up well is most definitely valid. Paul also pointed out that the Q550 is indeed MIL-STD-810G tested, meeting nine military standard tests for various demanding environmental conditions including transit drop, dust, functional shock and high temperature. I hope they soon add this to the specs.

    Now the conversation moves beyond the new tablet. I ask Paul why Fujitsu, the pioneer in tablets, appears to have discontinued their larger Stylistic slates, a storied line of tablets that went back, uninterrupted, a good 15 years or so. Well, they did stop the last of that line, the Stylistic ST6012, over a year ago because everyone seemed to be transitioning to convertibles, and Fujitsu has many years' worth of experience in that product category, too.

    Why the switch? "Convertibles are less expensive," Moore explained. It's simple physics: having the LCD in one case and the rest of the electronics in another means less complexity, fewer thermal issues, and thus less expensive components. So convertibles turned out to be less expensive, but more powerful and more reliable. Years ago, Fujitsu sold more tablets than convertibles, then the ratio switched. Good information and reasoning. I still think that Microsoft is as much at fault as physics, but in this instance the marketplace spoke, and Fujitsu followed.

    Then I get on a high horse on cameras. The Q550 tablet does have two of them, a front-facing VGA webcam, and a rear-facing 1.3-megapixel documentation camera. I haven't tried out the Q550's cameras yet, and I have no problem with a VGA webcam. But a 1.3-megapixel documentation camera is meager in an era where digital cameras with 14-megapixel sensors and 1080p HD video can be had at Walmart for less than a hundred bucks. Paul says he's had that discussion with his engineers, so no real argument there, other than that true digital camera guts can't easily be built into a slender tablet. I think they can.

    I've been on the phone with Paul Moore for almost an hour and it's time to let him go so he can get ready for his next call. I had a lot of fun. I learned things, I got some good information. And I hung up with the feeling that I had talked to someone who really likes his work and the products he represents. That makes all the difference.

    Thanks, Paul. And thanks, Wendy Grubow, for always keeping us informed about Fujitsu's latest.

    Posted by conradb212 at 12:11 AM

    May 9, 2011

    The problem with benchmarks

    When we recently used our standard benchmark suite to test the performance of a new rugged computer, we thought it'd be just another entry into the RuggedPCReview.com benchmark performance database that we've been compiling over the past several years. We always run benchmarks on all Windows-based machines that come to our lab, and here's why:

    1. Benchmarks are a good way to see where a machine fits into the overall performance spectrum. The benchmark bottomline is usually a pretty good indicator of overall performance.

    2. Benchmarks show the performance of individual subsystems; that's a good indicator for the strengths and compromises in a design.

    3. Benchmarks show how well a company took advantage of a particular processor, and how well they optimized the performance of all the subsystems.
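    To illustrate the subsystem idea, here's a toy micro-benchmark sketched in Python. This is emphatically not how Passmark or CrystalMark work internally; the workloads, the inverted-time scoring, and the 60/40 weights are all made up, just to show the structure of timing individual subsystems and rolling them up into a single bottom-line number:

```python
import math
import os
import tempfile
import time

def time_it(fn):
    """Return wall-clock seconds for a single run of fn."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def cpu_workload():
    # stand-in for a CPU subsystem test: a floating-point loop
    s = 0.0
    for i in range(1, 200_000):
        s += math.sqrt(i)
    return s

def disk_workload(path, size_mb=4):
    # stand-in for a disk subsystem test: sequential write plus fsync
    chunk = b"\0" * (1024 * 1024)
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())

def run_suite():
    cpu_t = time_it(cpu_workload)
    fd, path = tempfile.mkstemp()
    os.close(fd)
    try:
        disk_t = time_it(lambda: disk_workload(path))
    finally:
        os.unlink(path)
    # invert times into "bigger is better" scores, then combine them
    # with arbitrary weights into a single bottom-line figure
    cpu_score, disk_score = 1.0 / cpu_t, 1.0 / disk_t
    return {"cpu": cpu_score, "disk": disk_score,
            "bottom_line": 0.6 * cpu_score + 0.4 * disk_score}
```

    Even a toy like this shows why subsystem numbers matter: two machines with the same bottom line can get there with very different CPU and disk scores.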

    That said, benchmarks are not the be-all, end-all of performance testing. Over the years we've been running benchmarks, we often found puzzling inconsistencies that seemed hard to explain. We began using multiple benchmark suites for sort of a "checks and balances" system. That often helped in pin-pointing test areas where a particular benchmark simply didn't work well.

    There is a phrase that says there are three kinds of lies, those being "lies, damn lies, and statistics." It supposedly goes back to a 19th century politician. At times one might be tempted to craft a similar phrase about benchmarks, but that would be unfair to the significant challenge of creating and properly using benchmarks.

    It is, in fact, almost impossible to create benchmarks that fairly and accurately measure performance across processor architectures, operating systems, different memory and storage technologies, and even different software algorithms. For that reason, when we list benchmark results in our full product reviews, I always add an explanation outlining the various benchmark caveats.

    Does that mean benchmarks are useless? It doesn't. Benchmarks are a good tool to determine relative performance. Even if subsystem benchmarks look a bit suspect, the bottomline benchmark number of most comprehensive suites generally provides a good indicator of overall performance. And that's why we run benchmarks whenever we can, and why we publish them as well.

    Now in the instance that causes me to write this blog entry, we ran benchmarks and then, as a courtesy, ran them by the manufacturer. Most of the time, the industry's benchmarks and ours are very close, but this time they were not. Theirs were much higher, both for CPU and storage. We ran ours again, and the results were pretty much the same as the first time we ran them.

    The manufacturer then sent us their numbers, and they were indeed different, and I quickly saw why. Our test machine used its two solid state disks as two separate disks whereas I was pretty sure the manufacturer had theirs configured to run RAID 0, i.e. striping, which resulted in twice the disk subsystem performance (the CPU figures were the same). A second set of numbers was from a machine that had 64-bit Windows 7 installed, whereas our test machine had 32-bit Windows 7, which for compatibility reasons is still being used by most machines that come through the lab.
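    The RAID 0 effect is easy to model. Striping interleaves logical blocks round-robin across the member disks, so a sequential transfer is serviced by all disks at once; in the idealized case, ignoring controller overhead, two SSDs deliver twice the throughput of one. A minimal sketch of that model (the 250 MB/s per-disk figure is a made-up example, not a measured number):

```python
def stripe_map(logical_block, n_disks):
    """RAID 0: logical block i lives on disk i % n at stripe offset i // n."""
    return logical_block % n_disks, logical_block // n_disks

def ideal_sequential_mbps(n_disks, per_disk_mbps):
    """Idealized sequential throughput: consecutive blocks sit on
    different disks, so all n disks transfer in parallel."""
    return n_disks * per_disk_mbps

# Two SSDs at a hypothetical 250 MB/s each: blocks 0,1,2,3 land on
# disks 0,1,0,1, and the array reads at roughly 500 MB/s.
layout = [stripe_map(i, 2) for i in range(4)]
```

    Which is exactly why the manufacturer's disk numbers came out at about double ours: same SSDs, different configuration.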

    The manufacturer then emailed back and said they'd overnight the two machines they had used for testing, including the benchmark software they had used (same as ours, Passmark 6.1). They arrived via Fedex and we ran the benchmarks, and they confirmed the manufacturer's results, with much higher numbers than ours. And yes, they had the two SSDs in a RAID 0 configuration. Just to double-check, we installed the benchmark software from our own disk, and on the 32-bit machine confirmed their result. Then we ran our benchmark software on the 64-bit Windows machine, and... our numbers were pretty much the same as those of the machine running 32-bit Windows.

    Well, turns out there is a version of Passmark 6.1 for 32-bit Windows and one for 64-bit Windows. The 64-bit version shows much higher CPU performance numbers, and thus higher overall performance.
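    That makes it worth verifying which build of a benchmark, or any test harness, is actually running before comparing numbers. Python is used here just as a stand-in for any such tool: the pointer size of the running process gives the answer in two lines:

```python
import platform
import struct

def pointer_width_bits():
    # a C pointer is 4 bytes in a 32-bit process, 8 in a 64-bit one
    return struct.calcsize("P") * 8

def describe_build():
    return f"{pointer_width_bits()}-bit process on {platform.machine() or 'unknown'}"
```

    Scores from a 32-bit and a 64-bit build of the same suite simply aren't comparable, so a check like this belongs at the top of any benchmark log.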

    Next, we installed our second benchmark suite, CrystalMark. CrystalMark pretty much ignored the RAID configuration and showed disk results no higher than the ones we had found on our initial non-RAID machine. CrystalMark also showed pretty much the same CPU numbers for both the 32-bit and the 64-bit versions of Windows.

    Go figure.

    This put us in a bit of a spot because we had planned on showing how the tested machine compared to its competition. We really couldn't do that now as it would have meant comparing apples and oranges, or in this case results obtained with two different versions of our benchmark software.

    There was an additional twist in that the tested machine had a newer processor than some of the comparison machines that scored almost as high or higher in some CPU benchmarks. The manufacturer felt this went against common sense, and backed up the conjecture with several additional benchmarks supplied by the maker of the chips. I have seen older systems outperform newer ones in certain benchmarks before, so I think it's quite possible that older technology can be as quick or quicker in some benchmarks, though the sum-total bottom line almost always favors newer systems (as it did here).

    The implications of all this are that our benchmark suites seem to properly measure performance across Windows XP, Vista and 7, but apparently things break down when it comes to 64-bit Windows. And the vast discrepancy between the two benchmark suites in dealing with RAID is also alarming.

    It was good being able to use the same exact benchmark software to objectively measure hundreds of machines, but I am now rethinking our benchmarking approach. I greatly value consistency and comparability of results, and the goal remains arriving at results that give a good idea of overall perceived performance, but we can't have discrepancies like what I witnessed.

    Posted by conradb212 at 8:47 PM

    May 6, 2011

    Conversation with Ambarella's Chris Day about the state of still/video imaging in mobile computing devices

    In a recent blog entry I wrote about the generally low quality of cameras built into rugged mobile computers compared to the very rapidly advancing state-of-the-art in miniaturized imaging technology. It doesn't seem to make sense that high quality, costly tools for important jobs should be saddled with imaging hardware that ranges from only marginally acceptable to quite useless. Still and video cameras are now in tens of millions of smartphones and many of them now can take very passable high res still pictures as well as excellent video. I would expect no less from vertical market mobile computing hardware.

    Why is that important?

    Because the ability to visually document work, problems, situations or details is increasingly becoming part of the job, a part that can dramatically enhance productivity, timeliness and precision, as well as enable quick problem solving by consulting with home offices, etc., and it also helps create documentation trails. Add technologies such as geo-tagging and mapping, and the presence of high quality hybrid imaging functionality has an obvious and direct impact on return on investment as well as total cost of ownership. However, that only applies if the computer's still and video capturing capabilities are at the same high quality and performance level as the computer itself.

    Over the past several months I have asked several of my contacts in the mobile computing world why the cameras aren't any better, especially since many of them highlight those cameras as productivity-enhancing new features. Which they can be, but generally are not, or not yet. The cameras are slow, produce unacceptable pictures (low resolution, artifacts, color, compression, sharpness, large speed delays, interface), and video is generally almost useless (very low resolution, very low frame rate, etc.). I did not receive any compelling answers, just tacit agreement and somewhat vague references to space and cost considerations.

    So I decided to seek opinions from people at the forefront of today's miniaturized image processing solutions and get their side of the story. Molly McCarthy of Valley PR was kind enough to arrange a call with Chris Day, who is Vice President, Marketing and Business Development at Ambarella and has one of those very cool British accents.

    Why did I seek out Ambarella? Because when we took apart a video scuba diving mask I had been testing, I found Ambarella chips inside. The product was the Liquid Image Scuba Series HD mask that has a high definition still/video camera built right into the mask. It can shoot 5-megapixel still pictures and also 720p HD video (see our review). The mask including the camera costs less than US$250 and it records on a microSD card. We also reviewed another tiny sports camera that includes Ambarella technology (the ContourHD), and that one can do full 1080p HD video.

    What is Ambarella? It is a Silicon Valley company that was formed in 2004 to be a technology leader in low power, high definition video compression and image processing semiconductors. Chris explained that their main thrust is H.264 video compression, a technology that generates very good video at file sizes much smaller than conventional formats. Their largest market is what's called consumer hybrid cameras, the rapidly expanding segment of small cameras that can do both high quality, high resolution still images as well as superb high definition video. Ambarella is probably the leader in that area, and also the first to truly merge high-res video and still imaging.

    Ambarella's hottest market right now is sports cameras, the kind that generate incredible HD video of skiing, skydiving, car racing, and all sorts of extreme sports (including, of course, scuba diving). They also do cameras for security and surveillance, where the days of the grainy b&w low-res video often shown in "the world's dumbest criminals" type of TV shows are rapidly coming to an end. Ambarella also supplies other markets that rely on high compression but also high quality in their sophisticated imaging and forecasting systems.

    About 400 people work for Ambarella these days, 100 of them at the Silicon Valley headquarters. For the most part, Ambarella makes chips, but they are also getting closer to providing full products, and already offer hardware/software development platforms.

    I told Chris of my puzzlement over the primitive state of cameras built into most current mobile computers, especially considering that the professionals using those expensive high-quality computers could definitely use reliable, high-res cameras built into their equipment. Chris said that Ambarella did have discussions with several notebook manufacturers three to four years ago, but nothing ever came of it, primarily for cost reasons.

    Now it must be understood that a good part of Ambarella's value-added consists of the chips that do very fast, very good video compression, and general purpose processors can do some of that, so perhaps consumer notebook makers simply didn't see the need for the extra speed and quality when most notebook users don't ask for more than basic webcam functionality.

    Notebooks are one thing, of course, and tablets and smartphones another. Also to be considered is the fact that there are really two types of cameras used: vidcams for video conferences (increasingly referred to as "front-facing" cameras), and the much higher resolution documentation cameras (generally called "rear-facing") used like regular digital cameras. Most better smartphones and tablets now have two cameras, one for each purpose.

    To that extent, Ambarella created their iOne smart camera solution that brings full HD camera and multimedia capabilities to Android-based devices. The iOne's SoC (System on Chip) supports live video streaming, WiFi upload of video clips, and full HD telepresence applications. It also has multi-format video decoding for playback of Internet-based video content up to 1080p60 resolution (i.e. better than HD TV). Chris felt that sooner or later one of the media tablet makers would truly differentiate itself with a superior built-in camera.

    Ambarella also offers full development platforms for digital video/still imaging that contain the necessary tools, software, hardware and documentation to develop a hybrid DV/DSC camera functionality (see Ambarella consumer hybrid camera solutions here).

    The bottom line, Chris Day said, is that "it is possible to have a mobile computing device that is also a world-class camera." We're just not seeing them yet. I am convinced that the first professional mobile computing product that offers the still/video recording capability of an inexpensive consumer camera will have a definite strategic and marketing advantage.

    But what about the size and cost? As is, there are any number of imaging modules for those handy smartphones that are getting better all the time. They are tiny and inexpensive and light years ahead of what we now see in actual vertical market mobile handhelds and tablets.

    A step up are the imaging modules that go into standard digital cameras. Those are larger and more complex, but judging by the tiny size of today's consumer point & shoot cameras that often offer 14 megapixel and 1080p video, those electronics should also easily fit into many mobile computing devices. They cost more, of course, but given the fact that many consumer cameras are now under US$100, it should be possible. Consider one product that uses Ambarella technology, the Sony Bloggie Touch. It can do 12.8mp stills, 1080p video, has 8GB of memory and a 3-inch touch LCD, yet it's hardly thicker than half an inch and costs under US$150. The guts of this in a rugged tablet or handheld would make an extremely attractive combination.

    So the experts have spoken. It's doable. And it wouldn't even cost that much.

    Video/imaging integrated into cellphones has changed the world. A lot of reporting now originates from smartphones before CNN ever gets there. And there's already talk that smartphones may essentially replace the conventional low-end camera market. The technology is there.

    State-of-the-art DV/DSC Video/imaging could bring great value-added to rugged mobile computing hardware. Being able to document work, situations, conditions can be invaluable and truly open many new possibilities to get jobs done better and faster. But the pictures must be good, and users must be able to rely on the camera. Current camera modules cannot do that. HD video, likewise, could change everything. And it is truly lightyears ahead of the slow, grainy QVGA and VGA videos that most current computer cameras are limited to.

    Posted by conradb212 at 2:16 AM

    April 26, 2011

    Is the race for tablet supremacy already over? Many developers think so.

    Who could forget Microsoft CEO Steve Ballmer stomping around the stage and yelling "developers, developers, developers!" at conferences in the mid-2000s (see Ballmer's developers on YouTube)? Well, according to the Appcelerator/IDC Mobile Developer Report, April 2011, the developers have spoken and the news isn't at all good for Microsoft, and not even that good for Android.

    What Appcelerator and IDC did was survey a total of 2,760 Appcelerator developers on their perceptions regarding mobile OS platforms, feature priorities and development plans. The survey essentially showed that while Android smartphones have passed the iPhone in terms of sales and market share, developer interest in both Android smartphone and tablet apps has stalled and reversed, with both being behind interest in iPhone and iPad development. According to the report, this is due to "an increase in developer frustration with Android. Nearly two-thirds (63%) said device fragmentation in Android poses the biggest risk to Android, followed by weak initial traction in tablets (30%) and multiple Android app stores (28%)."

    I think that's worth a lot of thought. Despite frustration with Microsoft, Apple's market share in computers was in the low single digits for many years and not even the Vista debacle and Apple's great momentum in iPhones and such managed to lift the Mac OS to more than 10% or so (in Switzerland it's highest, with 17.6% according to StatCounter Global Stats, Feb. 2011). Yet, the situation is completely different with media tablets where the Apple iPad continues to be virtually unchallenged a year after its initial release. Apple still has a commanding market share. In 2010, it was 83.9% according to Gartner, which predicts that Apple will still hold an almost 50% share in 2015, still beating Android by several percentage points.

    Such market dominance by a single company is almost unheard of, especially in a market with a good percentage of customers who oppose the company on principle, as is the case with Apple. Then again, there's precedent: No one else managed to come close to Apple in MP3 players either. Even though MP3 players can be considered commodities and hardly cutting edge hardware, the iPod continues to reign supreme with a ridiculously commanding market share whereas Microsoft got absolutely nowhere with its Zune.

    But can this happen again with tablets? On the surface it seems impossible. Hardware is a commodity, and there are certainly more than enough critics of Apple's very controlled approach to the whole development and sales process. But here we are, a year later and there just isn't anything else out there to challenge Apple. Why is that?

    There are several reasons. First, Apple not only created a great product with superior battery life (a huge factor), but it also really aced the pricing. After Apple had been known as a premium-price player for almost all of its history, the iPad is pretty much the low-price leader. Sure, you can pick up an Android tablet on eBay for a hundred bucks, but those tablets are so poorly made and of so little use that they have actually hurt the Android cause rather than helped it. And like it or not, the Apple app store simply guarantees a good user experience. Knowing that there won't be inappropriate content, viruses, frauds and cons is invaluable. And having so many more good apps than anyone else is invaluable as well. As is having one source, and not having to figure out which store to go to.

    But back to pricing: Motorola and others learned quickly that pricing any media tablet higher than the iPad was simply out of the question. But pricing it lower is also pretty much out of the question if there is to be any profit potential at all. Remember that unlike Apple, any other hardware vendor does not have the built-in income from their own dominant app store.

    So what can the rest of the industry do? Make better tablets, for one thing. As is, the survey says that "while 71% of developers are very interested in Android as a tablet OS, only 52% are very interested in one of the leading Android tablet devices today." No surprise here; everyone else has essentially been copying Apple's look and features: Capacitive multi-touch? Check. Slender, glossy, black slate? Check. Nice icons, zooming, pinching, panning, etc.? Check. 3G? Check. Long battery life? Check (mostly). Simply beating Apple in a spec here or there won't make a difference; that's like Hyundai claiming they beat Mercedes-Benz or BMW in this stat or that.

    I am fairly sure Android will be doing well on tablets anyway, but as of now, the issues the platform faces are very real. According to the Appcelerator/IDC survey, by far the biggest concern is Android's fragmentation. Only 22% of the polled developers feel that the problem is that iOS is simply better, but almost two-thirds cite fragmentation. Too many tablets, too many versions of Android, too much needed customization. In that sense, it's a bit like the difference between developing for a game console where the hardware and software are constant (iPad), and developing for the PC where there are a million processor/OS/BIOS/storage/display permutations (Android tablets).

    But what of the Microsoft factor? Microsoft simply has got to know that a leading presence in mobile is mandatory if the company is to remain relevant as it's historically been relevant (i.e. being #1 in every market it enters). But well into year 2 of the tablet era, Microsoft remains in the same deer-caught-in-the-headlights gridlock over what to do. The issue is always the same: how to tie a non-PC platform into the PC-based Windows platform. Windows CE/Windows Mobile never really succeeded the way it could have because Microsoft intentionally dumbed down those platforms, fearing they might compete with the almighty Windows proper. In tablets, that attitude just won't do. Anything that looks like it's really just an adapted version of mouse Windows is not going to work. Not now, not ever. If Microsoft does not get over that mental block, Microsoft will not be a factor in tablets.

    As is, the polled developers already feel, by roughly a 2/3 majority, that no one can catch up to Apple or Android. The developers-developers-developers have spoken here, and so Microsoft finds itself in the odd position of having to hope that a hardware partner will pull a rabbit out of the hat. That has never worked too well for them in the past, with the sole exception of the original IBM PC deal. And even that meager assessment by the developers is probably based on the respectable early showing of Windows Phone 7. Microsoft's still amorphous tablet effort may be an even longer shot.

    Then there's the next issue. Are we perhaps entering an era where people abandon the web as we know it and simply turn to apps? It seems unthinkable and the web certainly won't go away any time soon, but let's face it, the web has become a big pain in many respects. Websites are jam-packed full of ads and commercial messages. More websites than not are simply nearly content-free decoys to lure AdSense and other ad click traffic. There's danger waiting everywhere. Often, the web today feels like running the gauntlet in a seedy neighborhood full of panhandlers and worse.

    Now compare that with the structure and safety of apps. They do exactly what you want them to do. They've been tested and certified. They are clean. And you are in command. That's why a growing number of companies now offer their own apps in addition to their websites. Yes, it's a bit ironic that we may all return to the gated communities that we had in the past (remember AOL and CompuServe?), but that's the way things seem to go. Already, developers interested in building apps outnumber those interested in the mobile web by 5:1.

    Does that mean the race is run? Not at all. Momentum can change very quickly. While it's unlikely that Apple or Android-based tablets will crash and burn, you never know if Microsoft or perhaps even HP with the WebOS will come out with something so awesome that the momentum shifts. Decades ago IBM found that they could not profitably compete in the very PC market they had created. Netscape was defeated by Internet Explorer, which initially had looked like a woefully inadequate competitor. Unbeatable Palm lost its mojo and vanished. It can all happen.

    As is, from the vantage point of a product reviewer and publisher, I am surprised by a number of things.

    First, I wonder why everyone simply copies Apple instead of taking advantage of Apple's weak spots. Yes, the almighty iPad has some weaknesses, and none worse than its ridiculously glossy, ridiculously smudge-prone display. Any major tablet vendor who comes out with a product that does not turn into a mirror outdoors has an instant, massive advantage and selling point.

    Then there's the vaunted leverage. "Leverage" has been Microsoft's main argument for decades, and it goes something like this: since 90% of all computers use Windows, everyone knows how to use Windows, and there are so many programmers who know Windows development tools and languages, it only makes sense to "leverage" that investment into other platforms. That was always the argument for Windows CE/Windows Mobile, and in the vertical markets, which are still almost 100% Windows Mobile, it worked. Now Google doesn't have any leverage, and Apple doesn't have much. In fact, it's amazing that Apple managed to build so much around arguably its weakest point, iTunes.

    Point is, if Microsoft came out with some way to truly leverage its Windows position into tablets and smartphones, things could change in a hurry. It's not clear how that could happen, but there simply has got to be a compelling way to truly extend the commanding position Microsoft has on the desktop (and on the lap) to tablets and smartphones. And no matter how positively surprised I was with Windows Phone 7, we're not talking just a Zune interface and automatic updates from Facebook and Twitter.

    And where does that leave HP and RIM? HP recently made noises about offering WebOS on a whole range of devices. The HP TouchPad will run WebOS, WebOS got mostly good reviews when Palm introduced it on its smartphones a couple of years ago, and HP certainly has deep enough pockets to make an impact. Palm/HP never sent us a Palm Pre or Pixi for review and I wasn't about to sign up for a 2-year telco contract just to review a Pre in detail, but from what I can tell, WebOS excels at something that is just a pain on the iPad--multitasking. The lack of useful multitasking is the one thing that keeps me from using my iPad for more than I already use it for, and the sole reason why I still take my MacBook Pro on business trips.

    RIM has more of a problem. For RIM, the question really is whether lightning can strike twice. RIM rose to prominence based on one single concept, that of providing secure, totally spam-free email in a pager-sized device. That worked for many years, but RIM struggled with adding some pizzazz to their BlackBerry devices, and going it alone on tablets seems undoable. In the Appcelerator/IDC survey, developers were somewhat excited over RIM's announcement that their PlayBook tablet would support Android apps. That's really no more exciting than Apple's claim that you could run Windows software on a Mac back when no one wanted a Mac. That said, it's almost impossible to figure out what does and does not make business sense, and so we may see some seemingly weird niche products.

    As far as the situation at hand goes, the developers-developers-developers have spoken. For now. Developers go where the money is, and even massive incentives go only so far against the lure of a successful app store and tens of millions of tablets sold. An awful lot is at stake here and it's a war, one that seems pretty clear right now (Apple strong, Android gaining), but also one where things can still change in a hurry.


    Posted by conradb212 at 9:03 PM

    April 22, 2011

    Microsoft....

    So I'm getting to the next machine in the review queue, charge it, then start it up, just to get nagged by Windows to activate the OS. Would I like to do that online, right now? Huh? Huh? I didn't think that was going to be possible since the machine didn't know the password to my WiFi network yet. But Windows wanted to try anyway and so I let it. Of course, it didn't get anywhere.

    So then I am in Windows 7, but there's this nasty message at the bottom right that says, "This copy of Windows is not genuine." Well, that's bad news as the machine is a prototype from a well-respected rugged computing manufacturer.

    I let Windows get access to my WiFi and tried the activation again. No go. I got an ominous message that said "You may be a victim of software counterfeiting." Oh, oh. So I accepted the option to "Go online and resolve now."

    Well, Windows then said that "Windows validation detected and repaired an activation exploit (used to prevent Windows Vista built-in licensing from operating properly)" and that I had to activate Windows in order to "complete the repair process and be able to use the full functionality of Windows Vista."

    Dang, and there I thought I was on Windows 7 on this brand-spanking new machine.

    Windows offered: "Not to worry, we can help you with that."

    The help consisted of offering me to buy genuine Windows, the professional version for just US$149.

    Now, first, I wasn't on Windows Vista. Second, I didn't have a non-genuine version of Windows. And third, I most certainly wasn't going to pay $149 to upgrade my brand-new Windows 7 to Windows 7.

    So I rebooted, and then rebooted again. Now Windows decided that my software was genuine and just wanted to activate it. And this time it worked.

    Go figure. And go figure how Microsoft can be in business.

    Posted by conradb212 at 3:13 AM

    March 29, 2011

    Why are cameras in mobile computers not any better?

    When I founded the original Digital Camera Magazine in 1997, almost no one thought that digital photography would ever seriously challenge film. At best, digital cameras were thought to become novelties or peripherals for computers. Yet, just a decade later, digital imaging had surpassed film and, in one of the quickest major technology upheavals, quickly made film irrelevant. As a result, digital cameras, which initially had carried a steep price premium, became more and more affordable. Today you can get a very good and incredibly compact 14-megapixel camera for less than US$100. In essence, digital imaging technology has become commoditized.

    Which makes one wonder why cameras integrated into mobile computing equipment aren't any better.

    It's sad but true: cameras built into mobile computers are simply not very good. Some are getting better, but virtually none are within a lightyear of even the most basic dedicated digital camera. And, worse for those who rely on top-quality tools for the job, cameras in consumer products such as smartphones and media tablets are generally much better than what is used in vertical market equipment. That is hard to explain.

    Why is it important to have a good camera in a mobile computer? Because mobile computers are expensive tools for important jobs. Image capture is quickly becoming a must-have feature in the field. Field workers must document all sorts of things out there, like accidents, conditions, extraordinary events, repair status, etc., etc. And those images must be good enough to be of value.

    As is, most cameras integrated into mobile computers cannot do that. The cameras are low res (hardly ever more than 3-megapixel), slow (often unacceptably slow), basic (few come close to the features even the cheapest dedicated camera has), and thus simply cannot do the job they're supposed to do. There are probably all sorts of explanations as to why that is, but I just can't buy them. If a cheap, tiny consumer camera can take award-winning pictures, the guts of such a camera can and should easily fit into a much larger mobile computer. Why this isn't happening is beyond me, but it just isn't.

    This stunning lack of cross-fertilization between two major technologies actually goes both ways. Cameras have progressed immeasurably over the past decade, yet to this day, digital cameras come with the same tiny 30MB or so of internal memory they always have. You can buy a generic MP3 player with 8, 16, 32 or even 64GB of storage for a few bucks, but even the most advanced consumer digicams have essentially no internal storage. Which is always a REAL pain when your card gets full or you forget to put one in. And let's not even talk about compatibility. In the camera world, every company has their own standard and almost nothing is ever compatible.

    That really needs to change. Customers who pay $2,000 or more for a rugged mobile computer should be able to take superb pictures with it, and shoot HD video, just as you can with a little $100 camera. There is simply no excuse, none, to put sluggish, insufficient imaging technology into expensive computer equipment. It cannot be a cost issue either; missing ONE important shot because a field computer's camera is so unwieldy and incapable can cost more than the entire device.

    So let's get with it, mobile computing and camera industries! Camera guys: You need some real storage in your product, and no, going from 30 to 100MB won't do. And give some thought to compatibility. Computer guys: Do demand and insist that the camera guts inside your wonderfully competent mobile computing gear are not an embarrassment. They should work at least as well as that brand name $79 camera you can pick up at Walmart. And that includes good video and a real flash!

    So there.

    Posted by conradb212 at 6:21 PM

    February 9, 2011

    How we get news

    A big part of the work here at RuggedPCReview.com is getting and spreading the news on what's going on in the rugged and mobile computing world. How do we do that? And how can manufacturers help get the news out?

    In the past, it was pretty simple. We went to trade shows to see what all was new, loaded up on glossy brochures, attended press conferences, and left behind a bushel of business cards so we'd be in the rolodex of everyone who mattered in the rugged computing industry. That pretty much ensured a steady supply of news via mailed press kits and such, plenty enough to fill a print magazine every other month back in the day when we published Pen Computing Magazine. For a while after that era, it was a hybrid thing, with part of the news coming the old-fashioned way, and part gleaned from websites.

    It's all changed now. We still go to the occasional trade show, and they are always fun and helpful. And you get to actually see the people there. But shows are also expensive and a time waster, what with all the traveling, cabs, airports, hotels, waiting in line, and then the rush at the show itself. So for the most part, trade shows are a (bitter)sweet memory now.

    Today, news comes from numerous sources, through numerous channels, and I get it all sitting in front of the big display of my iMac27 with dozens of windows open. That, for me, is news central, and here's where it all comes from:

    BusinessWire PressPass -- a daily email with headline news on the topics I subscribe to. The cool thing is that they show the company logo next to the headline. That makes it easy to very rapidly scroll down the (looong) email and stop when my eye catches a familiar logo. Seems like a little thing, but in this day and age of massive information overflow, we need all the filters and help we can get.

    PR Newswire for Journalists -- these are individual emails that include a paragraph that describes the news, and also links directly to a full press release. These are quite useful.

    Marketwire Newsletter -- another daily email with items of interest for me, but this one is all text, and the headline is accompanied by a paragraph. That increases the chance that I can search for keywords like "rugged" and catch things of interest. But it's also tedious to sift through a hundred paragraphs of news.

    Google alerts -- yes, Google does it again. I have Google alerts set up for pretty much all the companies I follow, and also some on beats I cover. They are typical Google minimalist, and, like Google searches, they tend to include stuff I really don't need, but it's a great way to keep track of all mentions of a topic or term. Very useful.

    PR folks and agencies -- yes, they still fill a purpose. I get emails from dozens of agencies and individuals. Some are very useful and I couldn't do without them. Others seem to simply pad their mailing lists. Overall, a good PR agency contact is invaluable. And good PR people assigned to the same account for a long time? Gold.

    Websites -- company websites are still the definitive, authoritative source of information. Problem is, many are falling behind the news. Some sites only seem to get updated when they have a web designer re-do their site. Then it eventually falls into near disrepair. That's the exception, of course, but even large companies with good sites often issue press releases without having the info up on their own sites when the news breaks. That is frustrating.

    Social media -- honestly, far less useful than what the in-crowd wants you to think. I just don't have the time to be a "fan" of every company I need to cover, be that on Facebook or Twitter or what all.

    Communities, Web 2.0, etc. -- the first company "community" site I saw was cool. I think it was the Sanda agency that did it for Trimble. It was well done, fun, informative. And the overall recipe has been copied by many others. This can be a nice way to foster a community spirit between companies and users, sort of like an ongoing user conference. But it's far too time-consuming for us media guys. We just don't have the time to stop by for a chat and a look around. So for news, not good. Overall, a nice concept, and useful.

    Pounding the street -- yes, we still do that. Not really the street, of course, but the web. That's because we inevitably miss news and things fall through the cracks. So periodically I go check websites to see if something happened that we missed. But we can't do that often enough for this approach to do anything but fill gaps.

    Too much news? Not enough news? -- Overall, of course, the world's drowning in news. And sifting through all that news takes a major chunk of my time every day. That, and then converting worthwhile news into our own, very targeted news items, product pages, and, eventually, the detailed reviews RuggedPCReview is known for.

    However, there seems very little consensus on how much news is right.

    There are companies that announce something practically every day, and that's often too much of a good thing. I am also not fond of news that really isn't news at all, but just a way to get in the news.

    On the other hand, there are companies that seem to avoid news like the plague. I look at their websites and find a news item from last summer, then one from the winter before that, and that's that. Not good enough. Every company that sells stuff has news, and that news needs to get out. It doesn't always have to be a new product announcement; news about updates, upgrades, partnerships, contract wins, successful deployments, tech primers, white papers; they are all news.

    Because, after all, news is about being in the news, being on top of the page, getting attention. That sort of exposure makes buyers think, "Hmmm... I just read about that company the other day. Let me look them up."

    And that's what it's all about.

    Posted by conradb212 at 4:12 PM

    January 18, 2011

    Bye-bye PXA processors? Probably not just yet.

    There was a time, around the year 2000, when Microsoft essentially decreed that Pocket PCs were to run Intel XScale processors. That was a big change, and a rude awakening for some of the Windows CE hardware vendors who had been promised that Windows CE would remain a platform supporting multiple processor architectures. But Intel XScale it was, and the Intel PXA became the de-facto standard processor for virtually all vertical market handhelds for a decade.

    So product specs for all those handhelds of that era weren't very exciting. They either had an Intel PXA255 or a PXA270 processor, with slight variations in clock speed. Considering the demise of the once high-flying PDA industry in favor of telco-controlled smartphones, those vertical market handhelds were a rather successful niche, with the occasional massive sale to parcel carriers, field service organizations, postal services, and so on. However, despite the virtual monopoly of the PXA processors, those industrial handhelds were not a lucrative enough market for Intel to remain interested. So in 2006, Intel sold the PXA business to Marvell for a modest US$600 million.

    Marvell, a silicon solutions company intent on cracking the emerging smartphone market, initially scrambled to find someone to make the chips for them. They then quickly launched the PXA3xx series of application processors, including the high-end 806MHz PXA320. When we tested the first handheld with the new PXA320 chip (the TDS Nomad), we were blown away by its speed and responsiveness.

    However, Marvell apparently did not have the reach and marketing power of Intel. Sure, Marvell PXA270 and even the older PXA255 chips continued to power numerous handhelds, but the powerful new PXA3xx chips had trouble gaining traction. There was a new design win here and there, but we also started seeing defections. And those who stayed with Marvell often chose the older PXA270 chip over the newer and more powerful PXA310 or 320.

    Of recent releases, Motorola stayed with Marvell for their new MC75A0 (PXA320) and MC55A0 (PXA320), but used a Qualcomm MSM processor for their ES400 enterprise digital assistant. Psion Teklogix chose a Texas Instruments OMAP3 processor for their Omnii XT10, GD-Itronix an ARM Cortex-A8 for their GD300. Datalogic did stay with Marvell for their new Elf (PXA310) and Falcon X3 (PXA310) handhelds, but combined them with ARM Cortex co-processors. DAP Technologies stayed with Marvell with their new M2000 (PXA270) and M4000 (PXA270) Series. Getac stayed with Marvell for their PS236 (PXA310) handheld, but not their PS535F, which uses a Samsung S3C2450. And then came the most recent blow for Marvell when Intermec based its new line of likely rather high-volume 70 Series handhelds on the TI OMAP 3530.

    The situation doesn't appear to be critical for Marvell yet, as the majority of handhelds out there continue to run on its processors, and there have been some good recent design wins for the PXA310 and PXA320. But the PXA3xx series is now also already over four years old, an eternity in processor terms. It's also not quite clear how Marvell's ARMADA family of application processors relates to the PXA chips. Marvell recently explained to me how ARMADA processors target various markets ranging from consumer display devices like eReaders and tablets to high-end HD TVs, but the name ARMADA never appears in vertical market handhelds, and while the PXA3xx processors are listed with the ARMADA chips, there also seems to be an ARMADA 300 Series with 300/310 chips. Confusing? I'd say so.

    A little wordplay anecdote here: two or three years ago, Marvell introduced its own "Shiva" CPU technology and announced it'd be used in upcoming SoC (system-on-chip) products. The PXA processors were then considered part of the Shiva family. So where's the word play? Well, it turns out that a year earlier, Marvel Comics had released a comic book with armored Shiva robots that could not be defeated the same way twice. Apparently Marvel Shiva and Marvell Shiva were too close for comfort, and so the Shiva name is gone from Marvell.

    Anyway, no, I don't think the Marvell PXA chips are going away anytime soon, but unless Marvell has some plans up its sleeve that we're not aware of, they also don't seem to be going anywhere. Which, come to think of it, is pretty much where vertical market handhelds are in general, sort of in a holding pattern until it becomes clear whether Microsoft can be counted on to provide a true next generation mobile operating platform, or not. And whether the fundamental changes in user interface expectation brought upon by the iPhone/iPad and Android smartphones will lead to pressure for similar functionality and ease-of-use in vertical market devices, or not.

    Posted by conradb212 at 10:25 PM

    January 7, 2011

    Microsoft announces.... nothing. Google follows suit.

    Well, the much anticipated Las Vegas CES is shedding no light on how the industry will react to Apple's monster tablet home run. Yes, there were some tablets here and there, but really nothing that we didn't know already, and certainly nothing earth-shattering.

    Microsoft, stunningly, showed nothing. Nada. No product, no strategy, no plan. The whole situation was remarkably similar to a time several years ago when erstwhile handheld champion Palm was on the ropes and Microsoft had an opening a mile wide to finally get some traction with Windows CE. What did they do then? Nothing. Well, they came out with Windows Mobile 2003 for Pocket PC 2nd Edition. But even that was better than simply nothing at all. And back then there was nowhere near as much at stake.

    If there is one single saving grace in this stunning inactivity, it's that Google, too, missed a giant opportunity to pull it all together and present to the world -- voila and ta-da -- the definitive Android OS for tablets, the one that will do battle with Apple, the one that will make Microsoft irrelevant in tablets forever after. Didn't do it.

    So those who stuck by Microsoft will now have tablets that really don't work very differently from the old Tablet PCs. And those who meekly tried Android or something else missed a golden opportunity to put themselves on the map.

    This is as close to forfeiting a game as it gets. By the time Microsoft may finally have something, Apple will have many tens of millions of iPads in the field. And after the virtual Android no-show at CES, the notion that Google seems unable to provide a cohesive tablet platform may get stronger.

    So 2010 was the year of the tablet, for Apple, and 2011 will again be the year of the tablet, and no one is playing other than Apple. No one among the big guys, that is. There have been some nice new products. Motion Computing's new CL900 tablet is a thing of beauty, and we really liked the little Samsung Galaxy Tab we had here for a few weeks.

    But overall, Microsoft's apparent inability to figure out what to do in tablets and Google's ongoing spreading itself too thin are eerily reminiscent of the days when CE devices from the likes of HP, Compaq, IBM, LG, NEC, Casio, Philips and others combined failed to get as much as a 25% handheld market share against the little Palms. Eventually, of course, Palm defeated itself, but that's not likely to happen to Apple.

    So the tablet crystal ball remains as milky as ever.

    Posted by conradb212 at 11:25 PM

    January 2, 2011

    Motorola, and the corporation names, corporation games thing

    So on January 4, 2011, Motorola will complete its separation into two companies. The way it actually works is that what used to be Motorola will separate Motorola Mobility Holdings, or Motorola Mobility for short, from Motorola proper, and Motorola will then change its name to Motorola Solutions. So technically it looks more like Motorola jettisoned its phone business to concentrate on the much more stable and predictable vertical market offerings developed and sold via Motorola Solutions. From a stockholder's perspective, they'll get one share of Motorola Mobility for every eight shares of old Motorola stock. The old Motorola stock will then undergo a reverse 1-for-7 stock split, so that seven shares of old Motorola stock become one share in Motorola with its new Motorola Solutions name (see how it works).
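
    The share arithmetic is easy to get wrong, so here is a purely illustrative sketch in Python of the terms described above (fractional shares are simply truncated here; how they are actually settled is not covered above):

```python
def spinoff_shares(old_shares):
    """Shares received under the split terms described above:
    one Motorola Mobility share per 8 old Motorola shares, plus a
    1-for-7 reverse split of the old stock into Motorola Solutions.
    Fractional shares are simply truncated in this sketch."""
    mobility = old_shares // 8
    solutions = old_shares // 7
    return mobility, solutions

# A holder of 56 old Motorola shares ends up with:
print(spinoff_shares(56))  # (7, 8): 7 Mobility shares, 8 Solutions shares
```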

    Sure reminds one of the lines in "We Built This City," that old Starship song sung by Grace Slick: "Someone's always playing corporation games, who cares, they're always changing corporation names."

    While the spun-off cellphone business will have about the same number of employees as the solutions business (both about 20,000), annual revenue on the cellphone side is expected to be US$11-12 billion and on the solutions side about US$8-9 billion. However, the cellphone business is exceedingly unpredictable compared to the much more linear solutions side. For example, who'd have thought that cellphone world leader Nokia would completely miss the smartphone wave? Who could have predicted the iPhone? And wasn't Motorola itself on top of the world with its RAZR (over 120 million sold), only to practically fall off the map when the follow-ups didn't catch on? And who could have predicted that the Droid would catch on as it did? It's a wild ride there in cellphones, feast or famine.

    Things are much different on the solutions side. Everyone needs "solutions," and solutions helped IBM and HP quietly not only remain relevant, but become bigger than ever despite diminished emphasis on hardware. IBM ditched its PC business (to Lenovo) and printer business (spinning off Lexmark), and while HP is huge in hardware, buying EDS (Electronic Data Systems) also made it one of the largest solutions providers, which now contribute a third of HP's revenue. Compared to those two giants, each with revenues over US$100 billion, Motorola Solutions will be small, but the business model sure looks promising.

    What is Motorola Solutions? Company officials have always struggled with communicating that clearly. In essence, they leverage established lines of two-way radios, network equipment, scanners, and handheld computers into solutions for just about any vertical market. The scanners and handheld computers, of course, come from Symbol, which Motorola acquired in 2007. In a sense, with the former Motorola now being Motorola Solutions, and Symbol being a large part of Motorola Solutions, it almost looks like Motorola merged into Symbol, though I am not sure what part of the Motorola Solutions revenue comes from Symbol and what part from the two-way radio and related equipment side. I am also not sure who packages and manages those solutions that rely on Symbol's scanners and handhelds, and the radios, and whether Symbol will be just a hardware shop or be more involved in the solutions process.

    Overall, one cannot help but wonder what Motorola had in mind with Symbol and its very considerable brand equity. After the acquisition there were a good two years where some Symbol products continued to have the Symbol name and logo before gradually getting the corporate Motorola logo. In the new Motorola Solutions, Symbol remains by far the most prominent brand, and I really do wonder what it all looks like from the inside.

    One almost wonders why they didn't just go back to the Symbol name. There's precedent for such a move: Motorola Solutions' biggest competitor in mobile hardware is Intermec, and Intermec once was just a subsidiary of Unova, and not a major one at that (though it swallowed Norand, just as Symbol swallowed Telxon). Yet, under the dynamic leadership of former CEO Larry Brady, Intermec established itself as a successful and driving force in the mobile/wireless industrial market, to the extent that in 2006, Unova changed its name to Intermec.

    Oh, and then there is the weird thing with the operating systems: Motorola Mobility (strange choice of name, actually, considering that the actual mobile computers went with the other side) totally depends on Google's Android now, whereas all of Symbol's handhelds use Windows Mobile. Yet, given Windows Mobile's rather tenuous position and uncertain outlook, Symbol/Motorola Solutions simply has to have much more than a passing interest in Android itself. But the Android expertise is now in the other Motorola company. Go figure. And that's before the looming possibility that the Oracle/Google lawsuit over Android may put a monkey wrench in the works, or that Samsung or HTC take over the Android phone business.

    Yes, they're always changing corporation names, and at times it's hard not to see it all as corporation games. In our fast-moving world where companies grow and buy each other, those games and struggles have become the norm, and sometimes one really wonders if all the overhead was worth it.

    I was reminded of that while following the course of action of another mobile computing conglomerate over the past three years or so. What happened there was that Roper Industries, a very diversified, almost US$2 billion company, added three mobile computer companies to its roster: long-established Canadian DAP Technologies, start-up Black Diamond, and JLT Mobile Computers. The three companies were put together under the "Roper Mobile Technology" name, with DAP, Black Diamond, and Duros (the former JLT models) being its brands. Roper Mobile Technology was then renamed RMT, Inc., with a nice and modern logo. That seemed to make much sense, but then the whole effort was shelved, with DAP Technologies absorbing the Duros lineup and renaming everything, retiring both its old "MicroFlex" brand name and the impressive-sounding "Duros" in the process (and also the somewhat contrived-sounding "Kinysis" name). Black Diamond is once again on its own as a subsidiary of Roper Industries. This probably all makes sense, but from the outside it looks confusing, and like a few years' worth of lost opportunity to establish a force in the mobile/rugged market.

    No one has a crystal ball, and every decision is (hopefully) the result of careful consideration, but sometimes it's hard to figure out why things are being done a certain way when there appear to have been much more logical courses of action.

    Posted by conradb212 at 4:47 PM

    December 22, 2010

    "10 tablets that never quite took off"

    This morning, one of my longterm PR contacts brought to my attention a feature entitled "10 tablets that never quite took off." It was published by itWorldCanada, which is part of Computerworld. Now Computerworld has been one of the world's leading sources of excellent IT reporting for decades (I used to contribute to it in a former life as a corporate CIO), but the "slideshow" was disappointing and missed the point by listing some older tablets and mocking them.

    Unfortunately, we're seeing a lot of this sort of stuff in the media now. Most younger editors seem to believe that Microsoft invented the Tablet PC in 2001 when, of course, tablets were around a good decade before that. Older editors who did not specialize in rugged vertical market hardware often have a distorted memory of what those pioneering efforts meant. While it's undoubtedly true that earlier efforts at commercializing tablets for the consumer market were met with little success, those tablets did succeed in many important vertical and industrial markets. Mocking older, pioneering technology for not being like the iPad is a bit like mocking the military Humvee for failing to succeed as a suburban SUV: different purpose, different time, different technology.

    The slideshow presents some interesting benchmark products that were ahead of their time (such as tablets from Motion, Acer, ViewSonic, Xplore, etc.), but the commentary seems oddly uninformed and flippant for a respected entity like ITWorld. They mentioned "a firm named Wacam" and dissed it for not being multi-touch. Well, first, it's Wacom, not Wacam, and second, Wacom's electromagnetic digitizer technology has successfully been used for about 20 years (and remains part of Wacom's G6 input technology that combines capacitive multi-touch with an active digitizer).

    The whole slide show seems ill-informed and condescending, sort of like a cheap potshot that dismisses earlier, pioneering efforts as nothing but technological pratfalls. That is hardly true, as what Apple eventually came up with, and what everyone else is trying to emulate now, stands on the shoulders of those pioneering efforts. If anyone is to blame for a lack of consumer market success, it's Microsoft which, in its insistence on "Windows Everywhere!," never made more than a token effort to provide an OS suitable for tablets. In light of this, the relative success of ruggedized, special-purpose tablet computers in industrial markets is even more impressive. A publication like IT World Canada ought to know and appreciate this.

    Posted by conradb212 at 6:35 PM

    December 16, 2010

    The tablet wars: background and outlook

    This whole tablet thing is really interesting.

    Despite getting soundly trashed by a good number of industry experts when the iPad was first announced by Steve Jobs on January 27, 2010, Apple ended up selling about ten million of them in 2010, and the same experts now predict that a lot more will be sold in the coming years. Everyone is scrambling to also have a tablet. Tablets are hot, tablets will demolish the netbook market, tablets will eat into notebook sales, Microsoft will gag and wither over having blown it with tablets, and so on and so on.

    So let's take a look at what's really happening. First, tablets are not new. I often see references in the tech press on how Microsoft invented tablets back in 2001 when they introduced the Tablet PC. That's not true, of course. Tablets go back at least another decade, and more if you count such concepts as the Apple "Knowledge Navigator" that was introduced in 1987, or earlier yet, Alan Kay's DynaBook of 1968. What's also mostly forgotten is that almost 20 years ago, the computing world was all hyped up over tablet computers that you could write on, slates that were sort of like "smart paper." None other than Microsoft's own GM of their Pen Computing Group stated that "the impact of pens on computing will be far greater than the mouse," and that was in November of 1991.

    See, back then, the buzz was building around pen computing as the next big thing, and pen computers were nothing other than tablets. Microsoft felt the heat because GO Corporation got a lot of press for its PenPoint OS that, unlike Windows, was totally designed for pens. GO released PenPoint in 1992, a company named Momenta released its own tablet interface, and a good number of tablet computers chose PenPoint, including the first IBM ThinkPad (yes, almost two decades before pundits made fun of the term "iPad," there was the ThinkPad, and later the WinPad). Microsoft battled back with Windows for Pen Computing, a version of Windows 3.1 that added a layer of pen functionality. An OS war took place in the early 1990s on such early tablets as the NCR NotePad, the Samsung PenMaster, the Fujitsu Point, the Toshiba DynaPad, as well as pen computers made by GRiD (courtesy of Jeff Hawkins, who later founded Palm) and a gaggle of long-forgottens such as Dauphin, TelePAD, Tusk, and several others.

    Microsoft won that war back in the early 90s, and they did it the way they always do it, by sheer, brute force. Windows for Pen Computing outmuscled PenPoint on the major platforms via some highly publicized sales, but it was a Pyrrhic victory as tablets went nowhere, in part because Windows just wasn't suitable for tablets and in part because the hype was about (underperforming) handwriting recognition as much as it was about tablets. One by one, the majors dropped out -- NCR, Compaq, Toshiba, IBM, NEC. Some hung in there long enough to see the complex and limited pen version of Windows 95, but tablets were done for the 90s. When Palm showed that pens could actually be quite useful, Microsoft launched Windows CE, but the small CE-based handhelds built by all the major Windows licensees were just too limited to excite anyone.

    But those early tablet efforts were not entirely wasted. A small but resilient tablet computer industry survived and kept developing specialized tablet solutions for vertical market clients.

    The next big tablet push came in 2001 when Microsoft, mostly on Bill Gates' belief in tablet computers, directed the world's computer makers to support its Tablet PC project. There was a widely publicized build-up with all sorts of tablet prototypes that culminated in the unveiling of the Tablet PC platform in the fall of 2002. At Pen Computing Magazine (which spawned the present RuggedPCReview.com) we reviewed all those early tablets, including the Acer TravelMate C100, the HP Compaq Tablet PC TC1000, the Fujitsu Stylistic ST4000, the Motion Computing M1200, the Toshiba Portege 3500, and the ViewSonic V1100, and we summarized it all in Pen Computing Magazine's detailed 2002 Tablet PC specification table. What's immediately noticeable is that most of those marquee tablets were actually what came to be called "convertible notebooks" or "notebook convertibles."

    What had happened was that Microsoft had gotten cold feet about the mainstream appeal of tablets in mid-stream, and ordered Acer to come up with a convertible notebook design. By the time the Tablet PC was actually and officially unveiled, the emphasis was clearly on notebook convertibles. The media was only cautiously optimistic about the outlook for the by now not-so-tablet-anymore Tablet PC, and the market quickly decided it didn't make much sense to pay extra for pen functionality on convertible notebooks that made thick and clumsy tablets, if they were used as tablets at all. So that didn't go over too well.

    There were plenty of parties to blame for the 2001/2002 Tablet PC concept's lack of success. Microsoft's midstream switch to convertibles pretty much killed the belief in the tablet-only versions. Tablet products cost more without offering tangible benefits. Microsoft's marketing support was lacking, to put it mildly. By far the most important problem, though, was that Microsoft once again tried to put an only slightly adapted version of Windows on tablets. That approach didn't work in 1991, it didn't work with Windows 95, and it didn't work with Windows XP in 2002.

    Then nothing happened in the tablet market for a good many years. Nothing, that is, other than a few vertical market vendors eking out a living offering various vertical market tablets for special applications. After all, if you have the right software and you have to walk around on the job, it IS easier and faster to just tap on a tablet than to set up a notebook and crank up Windows.

    So then the iPhone happens in 2007 and dazzles the world with a smooth, elegant, effortless user interface, one that lets you tap and pan and swipe with just the slightest touch, and where you can use two fingers to smoothly zoom in and out or rotate things. What made it all possible was Apple's use of a capacitive touch screen, a technology that neither needed a special pen like the preferred digitizer technology of the Microsoft Tablet PC, nor a stylus like most handhelds and PDAs. Capacitive touch, while hardly new, made using the iPhone fun and easy, but no one anticipated what would come next, and that was the iPad.

    As stated in the opening paragraph, there was much criticism when Apple first announced the iPad. It wasn't computer enough, you couldn't run real software on it, it was just a big iPhone without the phone, and so on and so on. What those critics didn't realize was that the only reason the tablet form factor hadn't worked before was because the software hadn't worked before. Or more precisely, because Microsoft's insistence on "Windows Everywhere" was a big, colossal failure. One more time: Windows was designed to be used with a mouse. A mouse. Not a pen, and not fingers.

    So what's the first thing Microsoft does when capacitive touch is starting to look like a real good thing? It adds touch to Windows 7. Which meant that the few Windows-based computers that also had a projected capacitive touch screen could be operated with touch. Sort of. Sort of, because Windows 7 is no more a touch-optimized OS than any other version of Windows before it.

    The sheer predicament Microsoft was facing became evident during 2010. As millions of iPads were sold, Microsoft had nothing other than Windows 7 with the usual bit of pen support. This left the door wide open for Google, which had opportunistically positioned the Android OS they had purchased and developed as the platform of choice for iPhone rivals. Despite the flop of their own Google phone, the surprise success of the Droid helped Motorola get back on the (phone) map and quickly established the Android OS as the primary alternative for most non-Apple smartphones.

    Not surprisingly, while Microsoft waited out 2010, it became apparent that Android, like iOS, could easily scale up to larger tablet form factors. This realization apparently caught Google somewhat by surprise, as their Android development efforts remained firmly concentrated on smartphones. That didn't stop a growing flood of bargain-basement-priced Chinese iPad copies from using (or maybe abusing) Android in cheap hardware with resistive digitizers that made them almost impossible to operate. This certainly didn't help Android look good, but the software platform's ascension into tablets is a done deal nonetheless.

    Interestingly, despite lots of tablet announcements, nine months after the iPad went on sale, there's really only one halfway credible Android tablet out there, and that's the Samsung Galaxy Tab. I say "halfway" because the Samsung tablet only has a 7-inch display, thus placing it in a different category from the iPad.

    So where does that leave the booming and seemingly unstoppable (experts predict many tens of millions sold in each of the next few years) tablet market? In an interesting situation for sure. Let's look at some of the forces at work:

    First, almost no one wants to truly alienate Microsoft, and so Android may well find itself getting the "PenPoint treatment," referring to the situation almost two decades ago where a better-suited OS was muscled off tablet hardware by Microsoft. However, Google is an entirely different class of opponent than the underfunded PenPoint movement was back then. But Microsoft is different, too, and though Microsoft has lost a great deal of momentum, it still controls the desktop and most of the notebook market.

    Second, even if Microsoft were to somehow prevail against Android, it still needs to face itself. For decades now, Microsoft has been its own biggest enemy with its dogged determination to use the big old Windows OS everywhere, whether it was suitable or not. Sure, they deviated a bit here and there, but whatever they tried elsewhere (Windows CE, Auto PC, special versions of Windows, etc.) always was sort of half-hearted and primarily designed not to encroach on Windows proper. So I just cannot see how a version of Windows 7 or 8 retrofitted to sort of fit onto tablets would meet with much more success than Windows for Pen Computing or the Windows XP Tablet PC Edition.

    Third, there's a digitizer predicament. From the very dawn of pen computing, starting with the earliest tablets, virtually all serious tablet computers used an "active" digitizer, i.e. the kind that lets you write smoothly and accurately onto the surface of the display as if it were a sheet of paper. Active digitizers allow for very precise drawing, writing in "digital ink," and also for handwriting recognition (which really works much better than most give it credit for). Active pens do not need actual physical touch for the digitizer to know where the pen is, and that makes them great for popping up pulldowns or explanatory balloons and such before committing to a touch that might trigger an action. Problem is, capacitive touch cannot do that. Sure, you can write with your fingers, but not in any meaningful way. For that you need a pen.

    And the digitizer predicament doesn't end there. A lot of the tablets (and convertibles) sold into vertical and industrial markets are going to be used outdoors where there are pesky things like bright sunshine, all sorts of reflections, rain, snow, dust, physical impact, and people wearing gloves. Capacitive touch displays can handle some of those, but not all. Possible answers are offering a variety of optional digitizers, or a combination of them. Both approaches increase costs, and they have their limits. And the underlying OS platform determines what kinds of digitizers make sense. For example, you can operate Windows quite well with a resistive digitizer, but Android really needs capacitive touch. Anyone who needs to write or draw on a tablet needs either an active or a resistive digitizer, and won't benefit from the wonders of touch-based zooming, panning and swiping, unless touch is combined with either active or resistive technology.

    The final, and greatest, problem is that the iPad has irrevocably changed what users expect from a tablet. If you give someone a tablet these days, they simply expect to be able to quickly zoom in and out in a browser, and they use two fingers to do it. If that doesn't work, or only works poorly, well, why doesn't this work like an iPad? This, then, is the danger facing everyone who makes a tablet that looks just like an iPad: it must also work as well as an iPad. Or almost as well.

    We'll probably have some answers soon. We'll soon know if Microsoft's answer to the iPad will simply be putting Windows 7 on tablets, or if they've learned from past mistakes. We'll soon know how successful Android will be in making major vendors truly commit to it. And we'll soon know whether HP will seriously try to add another option with the WebOS they got when they bought Palm.

    It should be interesting.

    Posted by conradb212 at 8:30 PM

    September 3, 2010

    What are discrete graphics, and why would you need them?

    If you follow the mobile computing beat, you've probably come across the term "discrete graphics." What that generally means is a computer's graphics display capabilities that are a separate sub-system and not part of the motherboard or, more recently, processor. Why should you care?

    Because as with almost everything else in life, one-size-fits-all only applies to a certain extent. Most computers take the one-size-fits-all approach, offering a set of features and performance that is good enough for most intended applications. Most, but not necessarily all. In graphics, that means that your standard mobile computer can handle all the usual functions such as communications, browsing, office apps and most media. However, the one-size-fits-all graphics capabilities that come with a system may struggle with more demanding applications such as advanced 3D graphics, CAD, GIS or other graphics-intensive tasks.

    Discrete graphics, while uncommon in mobile systems, are a standard part of almost all desktop and many notebook computers. Starting with the earliest IBM PC, computers had separate graphics cards that handled the moving of pixels on a display. When users wanted more resolution on their IBM PCs to run Lotus 1-2-3 than the standard 640 x 200 CGA, they popped in a Hercules graphics card that boosted resolution all the way up to 720 x 348 pixels and made charting faster and more impressive.

    Over time, graphics "cards" became increasingly powerful graphics subsystems that provided a very significant boost in capabilities and performance. Top-of-the-line graphics cards can cost as much as or more than the rest of the computer combined, and they often have their own heat sinks and fans. Graphics subsystems in notebooks are usually less conspicuous and can even be integrated into the motherboard, but they are still often differentiators between low-end and high-end versions of the same computer.

    Now who needs "discrete" graphics? Not everyone. In the past, separate graphics subsystems or cards often offered higher resolution than standard built-in graphics. That's because old-style CRTs were able to support multiple resolutions. LCDs are different in that they are designed as a matrix of a fixed number of pixels, and that is the "native" resolution that results in the crispest picture. Most integrated graphics are more than capable of running an LCD in its native resolution, and since the LCD doesn't support higher resolutions, there is no need for a graphics card that can drive more pixels.

    However, resolution isn't everything. Over time, computer graphics have evolved into a science with numerous standards and technologies. That's especially true in the areas of shading, rendering and manipulating 3D objects. This goes way beyond simply making pixels appear on the screen in a certain color and brightness. Games, for example, can require huge amounts of graphics computing power to make objects and 3D action look as lifelike as possible. 3D modeling and visualization likewise can require vast amounts of graphics computing power.

    How does all of this affect mobile computing? Well, mobile systems cannot possibly accommodate top-of-the-line graphics for the same reason that they cannot provide top-of-the-line desktop performance: power and heat. So just as mobile systems must strike a carefully designed balance between performance, weight, size, cost and battery life in their choice of processors, the same goes for their graphics sub-systems. Up to now, for the most part, the processor handled processing and its complementing chipset handled graphics. And the graphics part of those chipsets came from third parties that specialize in graphics, such as nVidia or ATI.

    Until recently, most mobile systems made do with the "integrated" graphics capabilities inherent in their chipsets. These designs share system RAM and have some limitations. Some higher-end or specialized devices had more powerful graphics to speed up certain applications.

    With the advent of Intel's 2010 Core processors, the game changed somewhat because Intel integrated the GPU (graphics processing unit) right into the CPU package. Intel claims this improves efficiency, speed and stability while graphics chipmakers probably view it as an Intel land grab designed to assert even greater control without, however, being able to provide the graphics performance some customers require. Both sides have their points, but one thing hasn't changed: a separate "discrete" graphics sub-system will still outperform one-size-fits-all integrated graphics, and may also provide graphics functionality not included in a standard integrated system.

    But what does it all mean in the real world?

    RuggedPCReview.com recently had a chance to benchmark test two mobile computers that offered discrete graphics on top of whatever came integrated into the chipsets. One of them was the General Dynamics Itronix Tadpole Topaz, a high-end "COTS" (Commercial Off The Shelf) notebook designed primarily for military applications. It came with an nVidia GeForce 8600M GT graphics sub-system. The other was a Panasonic Toughbook 31 equipped with discrete ATI Radeon HD5650 graphics. Both machines ran Intel chips at a clock speed of 2.53GHz. However, while the Topaz uses an Intel Core 2 Duo T9400 processor without integrated graphics, the Toughbook employs an Intel Core i5-540M with integrated graphics that can be turned on or off via BIOS settings.

    The two machines are not directly comparable as they address somewhat different markets. However, when comparing the GD-Itronix Topaz with a GD-Itronix GD6000 that runs the same processor but does not have discrete graphics, the Topaz substantially outperformed the 6000 both in 2D and 3D graphics benchmarks, and absolutely blew it away in an OpenGL benchmark, by about a factor of 12:1. Now, OpenGL (Open Graphics Library) refers to a cross-language, cross-platform API for 2D and 3D graphics and is widely used in CAD, simulations and visualizations. If a customer has applications that use OpenGL code, then having OpenGL optimized graphics is absolutely mandatory.
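
    How much a 12:1 OpenGL advantage matters for a given customer depends on how much of the workload is actually OpenGL-bound. Here is a back-of-the-envelope Amdahl's-law sketch in Python; the workload fractions are hypothetical, not measured:

```python
def overall_speedup(gl_fraction, gl_speedup):
    """Amdahl's law: overall speedup when only the OpenGL portion of a
    workload (gl_fraction of total runtime) runs gl_speedup times faster."""
    return 1.0 / ((1.0 - gl_fraction) + gl_fraction / gl_speedup)

# Hypothetical workload mixes, using the roughly 12x OpenGL factor above:
for frac in (0.2, 0.5, 0.9):
    print(f"{frac:.0%} OpenGL time -> {overall_speedup(frac, 12):.1f}x overall")
```

    Even a 50/50 mix nearly doubles throughput, and a workload that is almost entirely OpenGL approaches the full 12x, which is why heavy CAD, simulation and visualization users feel the difference so dramatically.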

    While the Topaz used its nVidia graphics full-time, the Panasonic's discrete ATI graphics can be switched on and off. Why would one want to switch off presumably superior graphics? For the same reason why in a vehicle you wouldn't want four-wheel-drive or a turbo engaged all the time when you don't really need it. Such performance boosters for special purposes can have a very negative impact on fuel mileage, and that, for now, is no different with discrete graphics. Panasonic quotes up to 11 hours of battery life with the discrete ATI graphics off, but only 5 hours with them on. That is a big difference.
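
    That runtime gap translates directly into average power draw. A quick sketch, using a hypothetical battery capacity (the Toughbook's actual watt-hour rating isn't given above, but the roughly 2.2x power ratio holds regardless of capacity):

```python
def avg_power_draw(battery_wh, runtime_h):
    """Average power draw (watts) implied by capacity and observed runtime."""
    return battery_wh / runtime_h

BATTERY_WH = 87.0  # hypothetical capacity, for illustration only

p_off = avg_power_draw(BATTERY_WH, 11.0)  # discrete graphics off
p_on  = avg_power_draw(BATTERY_WH, 5.0)   # discrete graphics on
print(f"~{p_off:.1f} W with discrete graphics off, "
      f"~{p_on:.1f} W with them on ({p_on / p_off:.1f}x)")
```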

    So what do discrete graphics get you in a modern Core i5 machine like the Toughbook 31? Not surprisingly, in day-to-day use, you probably would hardly ever notice the difference. But as soon as you get into 3D graphics and such, the ATI boosted performance by about a third, a very noticeable difference. The real payoff, again, comes in with OpenGL, where things happen more than four times as fast. That's the difference between barely tolerable and actual, real work.

    Bottom line? For now at least, if your application requires speedy 3D graphics or includes a lot of OpenGL code, discrete graphics is almost a must. It's a bit of a dilemma as Intel is clearly trying to eliminate third party separate graphics and probably doesn't pay much more than lip service to easy integration of external GPUs. This uneasy relationship may or may not contribute to the steep drop in battery life with discrete graphics engaged, but if battery life is an issue, it's certainly good to be able to engage discrete graphics only when needed, or when the machine is plugged in.

    Posted by conradb212 at 5:10 PM

    August 19, 2010

    New Intel Atoms, and how Oracle is helping Microsoft

    So Intel has added two more processors to its ever-growing family of Atom processors, with all their many branches and suggested applications.

    The new chips are the single-core Atom D425 and the dual-core Atom D525, both of which run at 1.8GHz, representing a small step up from the existing 1.6GHz D410 and D510. Thermal Design Power remains at 10 and 13 watts, and the stated quantity prices of US$42 and US$63 are also the same as those of the earlier chips (which, however, enjoy "embedded" status). There is one notable difference: the two new chips support DDR3 SODIMM, and Intel is promoting them for home and small business network storage devices.

    To put things in perspective: unlike the Atom N270 that made the netbook explosion possible and accounts for tens of millions sold, and unlike its N450 (and N455/N475) successor, the Atom "D" processors are the ones Intel targeted at "nettops," i.e. really inexpensive desktop PCs. From what I can tell, that strategy didn't pan out, as not many desktop PCs ever used the D410 or even the dual-core D510. Why? Perhaps desktops and notebooks are so inexpensive these days that consumers see no reason to get a machine with anything less than a "real" Intel chip (i.e. a Core 2 Duo or one of the new Core i3/i5/i7 chips). So one explanation for the new D425/D525 is that Intel is trying to salvage the Atom "D" by giving it DDR3 support, even if it's only for SODIMM, and targeting the chips at network storage systems, whatever exactly that is in real life.

    A bit more commentary about the current Atom situation: there are now no fewer than ten versions of the Atom "Z" processor, ranging from the anemic 800MHz Z500 to the considerably more powerful 2.13GHz Z560. On the market side, the vast majority of Atom "Z" processor-based products we're seeing use the 1.6GHz Z530, which, after all this time, still seems to be deployed pretty much interchangeably with the Atom N270 (there are products that offer both N270 and Z530 versions, and some that switched from one to the other and vice versa). In real life, I've never actually seen a product that uses the Z550 or Z560, which is odd, as even the Z540 gave the one Z540 product we benchmarked (the Panasonic H1 medical market tablet) a small but noticeable performance edge over the competition.

    But what about the new Atom "Moorestown" chips, the next iteration of the "Z" processors that should finally allow Intel to be competitive in the smartphone and similar device markets? Well, apart from their announcement in May of 2010, we haven't heard another thing, while ARM et al. get all the publicity.

    So it's hard to figure out what to make of Intel's Atom efforts to-date. On the one hand, there are the millions and millions of netbooks sold, but while everyone loved the low, low prices of netbooks, few were ever dazzled by netbook performance, especially in the graphics area. With the iPad showing what all can be done with a nominally much slower chip (the 1GHz A4), and netbooks getting ever closer to low-end notebooks, it's hard to see where that's headed.

    Anyway, so what about Android? It's interesting to see what difference a week can make, and in this case the difference is the lawsuit Oracle threw at Google over Android. The suit is arcane and I am not even going to try to present details (it has to do with Oracle now owning SUN, which owns Java, and Android supposedly using part of Java in a way Oracle does not approve of), but the mere fact that one 800-pound gorilla sues another 800-pound gorilla over a platform that up to that suit had almost unprecedented momentum is reason for concern. I mean, if you're a developer, you'd probably stop work and watch while Godzilla and Mothra duke it out. And while you're taking a breather, you may have time to cool off a bit over Android and realize that for now at least it's really little more than a smartphone OS, and that it's already awfully fragmented. And also that businesses may find it difficult to trust things that come out of the current implementation of the Android app store.

    This may all blow over and the two may come to an agreement, but let's realize that Oracle is in an entirely different class than SCO, which a few years ago tried to claim exclusive ownership of all things UNIX and Linux. It's hard to see everyone suddenly stop making Android phones, but if this escalates, you'll see a lot of the companies that announced Android-based "iPad killers" delay their plans.

    So who may be laughing all the way to the bank? Microsoft. Nothing could please Microsoft more than seeing Android derailed. Microsoft's own mobile plans are a mess at this point, at least as far as outsiders are concerned. There may at some point again be some sort of cohesive Microsoft mobile strategy and attractive product lineup, but there isn't one now. So the more time Microsoft has to communicate a clear plan and show some real products, the more likely it is to get back into the mobile game.

    And HP has a dog in the race as well. HP is already the mightiest computer company in the world, and its immediate fate is not affected by what happens to Android. However, HP is also the company that fumbled away the iPAQ brand and today has essentially no presence in the mobile/smartphone market. But they bought Palm and all of Palm's cool IP, and so if there ever was a time to make a strong push for WebOS, it's now.

    We'll see what happens, both with the Atoms and with Android. Who needs reality TV shows with all this stuff going on?!

    Posted by conradb212 at 4:58 PM

    August 11, 2010

    Android contemplations

    Off the cuff, the way I see it is that Android has a better than even chance of becoming the OS of choice for tablets and other mobile devices. Android is really nothing more than another Linux distribution, but one backed and sort of run by Google. Microsoft, of course, will make the usual arguments of leverage and security and integration with other Microsoft products, but the fact is that Linux itself can be at least as secure as anything Microsoft makes. Just look at Mac OS, which is also Unix-based, and Unix is the basis of Linux.

    As is, Android is still very much a smartphone-oriented OS. But since it is just a shell on top of Linux (Google might object to that simplification), it can very quickly be adapted to almost any platform. For example, I simply downloaded a stable version of Android, created a bootable version on a USB key, and then booted some of the tablets and netbooks in our lab with it. The hardware never knew the difference and almost everything worked right off the bat, including WiFi. Adapting touch drivers and a few other things would be very simple.

    The argument against Android is the same that people use against Linux: it's in the public domain. The program you need most may have been written by some guy from Leipzig or Buenos Aires, and that guy may have decided to ditch the code and move to Nepal. The reason why Microsoft has a stable platform is because they control it all, and the reason why iPhone/iPad apps are so very cool and polished is because you ONLY see what Apple examined and approved.

    So Android's (and thus Google's) challenge will be to create the semblance of a strong, unified PRODUCT called Android, something people can rely on, and not something where a poorly written manual tells you that you first need to rebuild the kernel with the -fxuOie switches turned on for the app to run. That will be a challenge.

    However, none other than General Dynamics Itronix has just released a handheld running Android. That would indicate that Android may be ready for prime time. And even if it isn't, and many questions remain, there's so much buzz and there's Google behind it. That alone will give anyone who offers Android or talks Android a strategic advantage.

    Oh, and manufacturers offering both handhelds and tablets/notebooks would finally have the advantage of running the same OS on all their platforms, and not a mobile and a full version of an OS, as has been the case with Windows and Windows CE/Mobile.

    Posted by conradb212 at 6:15 PM

    June 17, 2010

    Handheld Group Business Partner Conference 2010, Stockholm

    Much to my surprise, the Handheld Group invited me to do a presentation at their annual Business Partner Conference in Stockholm, Sweden. The Handheld Group is an international supplier of rugged mobile computers, including handheld terminals and tablets, and they've carved themselves a nice niche with a lineup that includes specialty devices as well as tailored solutions for a variety of uses. The annual conference is meant to provide a venue to socialize with business partners and inform them about products, outlook and opportunities.

    I've always liked these sorts of conferences as they provide a great way to talk with executives and product managers, and see the latest lineups all in one place. So I accepted Handheld's invitation and prepared a presentation on "Trends and Concepts in Mobile Computing."

    Getting flights these days is a real pain. The standard fares are outrageously high and apparently geared towards business travelers with unlimited expense accounts. I have low fare alerts for most of the destinations I potentially travel to, but those can be an exercise in frustration as those low, low fares are hardly ever actually available. I ended up paying several times the teaser fares, and that was with long layovers and a schedule convenient for the airlines, but not for me. In an era where a web page in London, Tokyo or Stockholm loads as quickly as one next door, we tend to forget how very far away those places actually are.

    After landing at Stockholm's nice Arlanda airport I booked a ticket on the super-fast bullet train to downtown Stockholm, then, since it was a glorious morning, walked the 5K or so to the Elite Hotel Marina Tower where the conference was held. That was fun, though I was quite addled with jet lag, and the little wheels on my carry-on probably didn't like the cobblestones of old-town Stockholm much.

    Much to its credit, the hotel let this weary traveler check in at 10:30AM, and so I took a long nap in my nice hotel room that looked like it came right out of an Ikea showcase. Then it was off to registration and meeting my hosts. They were all friendly as can be, and I noticed that most Swedes indeed are blond. I met Sofia, my main contact at Handheld HQ, then Jerker Hellström, CEO and Chairman of the Handheld Group, and Thomas Löfblad, not blond, and, thanks to a course of study in the US, possessed of less of an accent in his English than I have after 30+ years. The two welcomed the assembly to the conference, introduced the business partners, and kicked off the cocktail party mingling.

    The Handheld folks did a great job making everyone feel at ease, and so I soon had interesting conversations with the mostly European attendees as well as some from as far as Australia. I found quite a few veterans of the old Husky Computers that was later bought by Itronix -- not surprising, as I learned, since the privately held Handheld Group had once gotten its start as the Scandinavian representatives of Husky. I got a chance to meet Daniel Magnusson and Nina Hedberg of RAM Nordic AB, which had the patented RAM Mount solutions on display; the Sacci folks with their numerous bags, harnesses and other clever ways of carrying around and using handheld computers; SIGMAX with their law enforcement and ticketing solutions; Brodit with their very clever mounting solutions; I got a demonstration of the impressive mobile device management solutions by The Institution, and spent time with all the other cool stuff there. I also had a chance to finally meet the HHCS Handheld USA team, including Mike Zelman, Dale Kyle, and my ever-helpful contact, Amy Urban.

    Given the 9-hour time difference compared to California, I slept remarkably well and woke up refreshed and ready for a day of conferencing. CEO Hellström gave an overview of the company, its total and exclusive dedication to rugged computing, and its pride in being the fastest growing IT company in Sweden (Handheld actually grew during the difficult year of 2009). Hellström described Handheld's "virtual production model" where the company's engineers generate specifications and design, then have the products made by a production partner, and launched either alone or with a partner. He highlighted the company's products and special solutions, as well as the newly introduced Algiz 7 rugged tablet.

    Following was an excellent presentation by David Krebs, director of mobile and wireless research at VDC, a frequently quoted authority on all things mobile, and a compatriot who grew up a few short kilometers from my original home in Zurich, Switzerland. David described the current mobile technology market as "in a state of rebound" after a serious setback in 2009. He pointed out that technology penetration in many mobile areas is still only 20, 30 or 40%, leaving plenty of potential opportunity, and predicted annual handheld revenue growth of 7.5% through 2014. David also highlighted the significant advantage of rugged over non-rugged handhelds and tablets in terms of failure rates, resulting in substantially lower TCO (total cost of ownership), certainly a big selling point on the road to recovery.

    I had decided to go out on a limb and run my presentation on my iPad via Apple's iPad dock-to-VGA adapter. This worked just fine, using Apple's US$9.95 iPad version of Keynote, which is Apple's equivalent of PowerPoint. In my presentation I discussed some of the concepts and trends in mobile computing, ranging from processors, to outdoor-viewable displays, to digitizers, operating systems, and emerging new technologies. Murphy's Law struck when, stunningly, the frame of my reading glasses broke right in the middle of my presentation, forcing me to continue with one hand holding up my glasses and the other operating the iPad. Fortunately, I had a clip-on mic or else I'd have needed a third hand.

    After that, Thomas Löfblad discussed the Handheld product line, which by now includes over a dozen state-of-the-art handhelds and tablets, as well as printers and accessories. All of the newer products carry Handheld's own Algiz (tablets) and Nautiz (handhelds) brand names.

    After lunch, Mr. Hellström discussed the product roadmap for the year ahead, with the full rollout of the newly introduced Algiz 7 tablet, a second generation Algiz 8, and a glimpse at an upcoming new product that will extend Handheld's line into a new class of devices. The company took the opportunity of the partner conference to get feedback and commentary on the new form factor, the proposed features, functionality and price.

    After face time with the new product, we heard about Handheld's plans on moving forward. Sofia Löfblad talked about how the company can support its partners with case studies, advertising support, loaners, product reviews, a special support website, and several other programs. Service and Support Manager Max Dahlbom then did a humorous, energetic presentation on service, warranty, care levels and support, all geared towards helping and supporting customers and stressing the importance of good service as a differentiator. Thomas Löfblad then addressed issues such as the impact of the weak Euro, the company's MaxFreight service, insurance issues and also some product updates.

    The final presentation came from Dean Lindsay, a motivational speaker and best-selling author (everyone attending got a copy of Dean's highly recommended book "The Progress Challenge") who, engagingly and entertainingly, talked about common sense concepts of attracting and fostering business and sales.

    What followed was an absolutely delightful four-hour cruise of the Stockholm waterways aboard the M/S Riddarholmen. We were greeted onboard with champagne, GPS-equipped Algiz 7 tablets were mounted in several locations and provided mapping and navigation data, and the food and drink were as delightful as the varied scenery passing by.

    There was again ample opportunity to mix and mingle, compare notes, and talk with people from the Handheld Group as well as partners and customers. The weather played along with bright sunshine all conference long (which apparently no one expected), and then some dramatic clouds at dusk. Unusual for those of us not living in northern latitudes, it didn't really get dark until way late into the night, and daylight remained even after we got back to the dock at 11PM.

    I missed out on Stockholm sightseeing the next morning as I had to grab a cab to the airport for my trip back to California. Long as the flights back were, including an 8-hour layover at Chicago O'Hare, they gave me an opportunity to reflect on a side of business we often forget or take for granted: the people side. There are lots of great products out there, all able to do amazing things. But it takes people with vision and drive and competence to form companies that can pull it all together, picking a lineup of compatible products for a well-defined purpose, then marketing, selling, supporting and servicing those products. In the end, that's what it's all about: dealing with people you know and trust, folks who've been there and will be around, and who know their business. That's the impression I got from the Handheld Group. Good company, good people.

    Posted by conradb212 at 7:17 PM

    May 28, 2010

    4G

    Pretty soon everyone will be talking about 4G: who has 4G, and whose 4G is better or faster. Somehow, marketing from all wireless camps has latched onto the cool-sounding terms 3G and 4G, though they're avoiding the "3.5G" or "3.75G" you often find in tech specs. That's probably because three and a half sounds like not quite four.

    Anyway, Sprint is now making noises about 4G, and you can actually buy 4G smartphones using the Sprint network. Since Sprint is a bit on the ropes, being first may not mean all that much, but it's still good to know how things developed and where they are headed.

    A couple of years ago, a product manager from one of the rugged computing manufacturers asked me what I thought of WiMAX. At the time WiMAX was a buzzword for really fast next generation wireless. I told him it was probably too early to worry about it and it wasn't clear what would happen. This is still true in 2010, but WiMAX is now available as "Sprint 4G" though technically it's the same network as the CLEAR brand name 4G network from a company named Clearwire.

    Who's behind Clearwire? None other than Craig McCaw, the trailblazer who once built McCaw Cellular and then sold it to AT&T in the early 1990s. Where does Sprint fit in? Well, McCaw and Sprint both owned spectrum in the 2.5GHz band (in fact, between them they own almost all of it) and decided to pool resources, with Sprint making additional investments until it became the majority stockholder of Clearwire.

    Does this mean it'll be clear sailing for Clearwire/Sprint in the emerging 4G arena? Not really. The problem is, this time both AT&T and Verizon are backing an alternate 4G technology called LTE, which stands for Long Term Evolution (how do they come up with these terms?!). LTE uses the 700MHz band, and physics dictates that waves in that spectrum can travel farther at lower power levels and also have less trouble penetrating buildings. This potentially means that LTE, in addition to being backed by the two largest wireless providers, also costs less to deploy.
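    That physics can be put in rough numbers with the textbook free-space path loss formula. The sketch below is illustrative only (real-world propagation depends on terrain, buildings and antennas), but it shows the roughly 11 dB advantage a 700MHz signal has over a 2.5GHz one across the same distance:

```python
import math

# Free-space path loss in dB for distance d (km) and frequency f (MHz):
#   FSPL = 20*log10(d) + 20*log10(f) + 32.44
# A simple illustrative model; real-world propagation is far messier.
def fspl_db(d_km, f_mhz):
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

# The same 5 km link at LTE's 700MHz vs. WiMAX's 2.5GHz band:
advantage = fspl_db(5, 2500) - fspl_db(5, 700)
print(round(advantage, 1))  # ~11.1 dB less loss at 700MHz
```

    Note that the frequency advantage is independent of distance: the 20*log10(2500/700) term is the same for any link length.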

    For now, Sprint and Clearwire claim that their 4G WiMAX network allows mobile download speeds of 3 to 6 mbps with speed bursts over 10 mbps, and uploads up to 1 mbps. Sprint and Clearwire sell both 3G/4G Mobile Hotspots (made by Sierra Wireless) and 3G/4G USB modems that look like standard USB memory keys. Clearwire offers unlimited usage for US$40/month plus a small lease fee for the modem (you can also buy it). That sounds like a good deal, but for now the coverage map is quite limited.

    More speed is exciting, but at least based on my various devices that use 3G, I'd be thrilled to have reliable 3G coverage wherever I go before I jump to 4G.

    Posted by conradb212 at 3:57 PM

    May 19, 2010

    Intel vPro technology—what is it all about?

    If you follow chipmaker Intel, you know that the company not only loves code names, but also special technologies that are then used to market certain chips or chip families. At some point it was "with MMX" that made Intel Pentium chips special in hilarious commercials showing Intel engineers in astronaut suits. "Hyper-threading" was big for a while, and for the latest families of Core processors, Intel stresses "Turbo Boost." Another Intel technology that gets a little less attention is vPro, but vPro is now becoming part of the marketing message of some ruggedized mobile computing products that have been upgraded to include Intel's latest Core i5 and Core i7 processors.

    So what is vPro all about?

    vPro is an Intel technology platform that allows remote access to a PC regardless of whether the computer is booted up or the power is even on. It is intended to allow remote management, monitoring and maintenance while maintaining strict security measures. While vPro componentry needs to be included in the processor, it's a platform rather than just a technology feature, one that requires a combination of chip, board, firmware and software. vPro also includes other Intel technologies such as Intel AMT (Active Management Technology), Intel Virtualization Technology, Intel's TXT (Trusted Execution Technology) and, of course, a network connection.

    While remote access and management of PCs is commonly available through software such as VNC, VNC alone may not be capable and secure enough for all corporate purposes. With vPro, VNC can still be used, but it is now the Intel AMT part that facilitates secure communication with the PC, and in conjunction with the whole vPro platform, it is not only possible to control a remote PC, but also to start it up and—even more amazingly—log in and perform certain functions even if the OS is corrupted or missing. That's because the vPro engine/platform is available at a very low system level.

    How can such vPro-based remote access be used? Well, there could be a system where dispatch sends job requests to a mobile computer in a field office or a vehicle. The request will boot the computer if it is off, and then either perform a job or prompt an operator or driver to do the job and report back. It can then turn off the computer remotely, even shut down the OS. As long as the computer still has power, it remains remotely accessible (remotely waking up a PC is usually done via a hardwired LAN connection).
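    vPro/AMT uses its own authenticated management protocol, but the basic idea of waking a powered-down machine over a wired LAN is the same one behind plain old Wake-on-LAN, which is easy to sketch. This is ordinary WoL, not AMT, and the MAC address below is just a placeholder:

```python
import socket

# A classic Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by
# the target NIC's MAC address repeated 16 times, sent as UDP broadcast.
# Note: vPro/AMT uses its own authenticated protocol; this plain-WoL
# sketch only illustrates the wired-LAN wake-up idea. The MAC address
# used below is a placeholder.
def magic_packet(mac: str) -> bytes:
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

print(len(magic_packet("00:11:22:33:44:55")))  # 102 bytes
```

    The catch, of course, is that plain WoL only powers the machine on; it offers none of the authentication, OS-independent login or remote repair that the vPro platform adds on top.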

    Panasonic highlights vPro in the recent introduction of its Toughbook 31 rugged notebook, which uses the vPro-enabled Intel Core i5 processor. Panasonic even features a video that shows the use of vPro technology between a dispatch center with a vPro console and a Toughbook-equipped service truck. It demonstrates how the remote console can wake up the Toughbook, run a job, then shut it down again.

    Motion Computing, too, stresses the advantages of vPro in their announcement of the upgraded Motion C5v and F5v tablet PCs, stating that vPro technology will enable their customers to experience enhanced remote management capabilities so IT can secure and/or repair a tablet from any location, even with power off.

    So that's vPro, a set of technologies to remotely access and control computers securely. While remote access and control is not new, being able to do it securely, and with power down and no OS booted can definitely come in handy. Not everyone will need or use vPro, and setting things up for remote access and management is not entirely trivial, and so most users will simply enjoy the very significant performance increases of Intel's latest Core i5 and i7 processors.

    Posted by conradb212 at 4:18 PM

    May 6, 2010

    "Moorestown" — Intel's new Z6xx Atom platform and how it fits in

    On May 4th, Intel introduced the next generation of its initial family of Z5xx Atom processors. Codenamed "Moorestown," the Z6xx family, together with a new I/O controller and signal processing chip, is meant to make Intel competitive in the booming smartphone and internet access device market. On paper at least, the new processor family looks very good and may yet help Intel establish itself in the device market (which, interestingly, it abandoned when it sold the XScale application processor business to Marvell a couple of years ago). But before we go into details of Moorestown, let's backtrack and see how Intel's whole Atom venture began and developed.

    "Silverthorne" and "Diamondville"

    The Intel Atom processors have been around for over two years now. Initially, Intel launched two different product lines, the Z5xx "Silverthorne" processors geared towards mobile internet devices (MIDs), and the N2xx line of "Diamondville" processors for standard low cost PCs and netbook class devices.

    The Z5xx versions of the Atom processor had a 13 x 14 mm package footprint and used the also new “Poulsbo” System Controller Hub. The processor had about 47 million transistors—more than the original Pentium 4. Bus frequency was 400MHz or 533MHz, and the Thermal Design Power (TDP) was between 0.85 watts for a low-end 800MHz chip, and 2.65 watts for a 1.86GHz Z540 version. The chipset used about 2.3 watts, which meant total CPU and chipset consumption wasn’t even 5 watts, far less than any of Intel's standard mobile processors. And the chipset had hardware support for H.264 and other HD decoding (but required the appropriate codecs to take advantage of it!). However, as the combo was targeted for internet devices, there was only PATA and no SATA support, though SATA could be added.
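    The "wasn't even 5 watts" figure is simple arithmetic on the worst-case TDP numbers just quoted:

```python
# Worst-case CPU + chipset power for the original Z5xx platform,
# using the TDP figures quoted above.
z540_tdp = 2.65    # watts, 1.86GHz Atom Z540
poulsbo_tdp = 2.3  # watts, "Poulsbo" System Controller Hub
print(round(z540_tdp + poulsbo_tdp, 2))  # 4.95 W -- under 5 watts
```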

    The Atom N2xx "Diamondville" family, released a bit later, was very similar to the Z5xx, so much so that to this date, I've yet to find someone who can convincingly describe why a manufacturer would pick one or the other, or what truly differentiates the two families. The N2xx was a bit larger, measuring 22 x 22 mm, and the most popular model—the 1.6GHz N270—also had a, for Intel, very low Thermal Design Power of just 2 watts. The N2xx processors did not come with a newly designed chipset, but used lower power versions of the standard Intel 945 chipset and a separate ICH7M I/O chip. There was no HD decoding or hardware acceleration, but the chip did support the SATA interface.

    The initial Atom processor families did not use two cores for cost and power conservation reasons. Instead they used Intel’s older HyperThreading technique that can process two threads, yet increases energy usage by only about 10%. Intel also developed a more power-efficient bus and a cache that could be disabled when it was not needed. The Atom Z5xx further used a new "Deep Power Down" C6 state, and similar advanced power management was available in the Atom N2xx.

    What happened next was interesting. While Intel probably had high hopes for the Z5xx chips in the emerging "mobile internet device" market, it was the "Diamondville" processors, and more specifically the 1.6GHz N270, that almost singlehandedly created the new category of "netbooks" (well, the term had been used before, but never to describe a separate class of mobile computers). Despite the N270 chip's modest performance, consumers bought millions and millions of those little netbooks, most likely because of the low price that made netbooks an impulse buy as opposed to spending more for a "real" notebook computer.

    The N270, however, was the sole bright spot in the entire Atom lineup, as neither the desktop-oriented N230 nor the entire mobile internet device Z5xx family did much of anything. The Z5xx chips were used in some industrial products like computers-on-modules, small form factor CPU boards, industrial tablets (such as the Handheld Algiz 8, the MobileDemand T7000, the Logic Instrument Fieldbook, or the WinMate I980), MCAs (such as the Panasonic H1 or the Advantech MICA-101), and clamshell UMPCs (such as the Fujitsu UH900), but by and large there seemed to be no truly compelling reason to go with Silverthorne.

    "Diamondville" gets a little boost

    On the Diamondville netbook side, the problem with the Atom N270 was that despite being used in all those netbooks, it was barely powerful enough to drive even those small, inexpensive computers. Anyone trying to do video or games on a netbook came away sorely disappointed. As a stop-gap solution, Intel released the very slightly more powerful N280 (1.66GHz clock speed instead of 1.6GHz) for netbooks, and the dual-core N330, which was really a dual-core version of the little-used desktop N230. With Atom video performance lagging, NVIDIA came up with its Ion graphics chipset that was supposed to work better with Atom N-Series chips than Intel's own chipset, but it didn't come in time to make it into any of the first generation netbooks.

    "Silverthorne" gets tougher

    For embedded computing, in March of 2009 Intel quietly expanded the Z5xx platform with larger form factor versions that carried a "P" in their name, and then a special "large form factor with industrial temperature options" version marked with a "PT." This added the Atom 1.1GHz Z510P and 1.6GHz Z530P as well as the 1.1GHz Z510PT and 1.33GHz Z520PT. The P and PT versions used a larger 22 x 22 mm package (the same size as the N2xx chips) with a different "ball pitch"—the spacing of the little balls of solder that replace pins on the underside of these tiny processor packages. That was probably done because the 0.6mm ball pitch of the original Z5xx series required high density interconnects (HDI) on the printed circuit boards, and those are more difficult to do and also more finicky, not what you'd want in the kind of rugged devices the chips were actually used in. As far as temperature range goes, 32 to 158 degrees Fahrenheit is considered "commercial," whereas -40 to 185 degrees Fahrenheit is considered "industrial." Interestingly, only the "PT" series processors support the industrial temperature range; the "P" series versions are listed with the same commercial temperature range as the initial chips.

    RuggedPCReview's assessment in 2009 was that "the moral of the Atom story is, at least for vertical market manufacturers: pick an Atom chip that Intel is likely to support for several years, and make certain the drivers are fully optimized and all the power saving features are fully implemented. Atom can deliver superior battery life and acceptable performance, but manufacturers must carefully target those products so customers won't be disappointed. We've seen Atom-based machines that use hardly less battery power than devices with much more powerful processors. That won't do. And we've seen some where non-optimized graphics drivers made the machines painful to use."

    "Diamondville" begets "Pinetrail"

    In December of 2009, Intel announced the next generation of Atom processors, or really the successor of "Diamondville." The new "Pinetrail" generation of Atom processors included the single core N450 (heir to the N270) and, adding yet another letter class, the single core D410 and the dual-core D510, both meant for low-end desktops. The big news here was that Intel reduced the chip count from three to two by integrating the graphics and memory controller into the CPU itself. The old ICH7M I/O controller chip was replaced with the Intel NM10 Express. That meant fewer chips to mount, somewhat lower power consumption, and—not mentioned by Intel—one less reason to seek third party chipsets such as NVIDIA's Ion. Reducing the chip count from three to two was nice, but the Z-series processors already had that. Graphics seemed somewhat improved, but not enough to make a huge difference, and there was still no HD playback hardware support. Our assessment was that we could not "help but feeling that Intel looked out for itself more than adding compelling value for consumers."

    So for now, the N450 and the slightly faster 1.83GHz N470 are taking care of the netbook market, but what of the ever more important MID and smartphone market that Intel tried to address with Silverthorne? By now it was very obvious that Silverthorne had zero impact on that market and that no one was going to base a smartphone or anything like it on an Atom Z5xx chip. Intel might have suspected as much, as even in the early days of Atom, their roadmap included codename "Moorestown," a system-on-a-chip platform.

    Silverthorne replaced by "Moorestown"?

    Well, Moorestown was officially introduced on May 4th, 2010. It includes the "Lincroft" Z6xx series of Atom chips, the "Langwell" Platform Controller Hub MP20, and the "Briertown" Mixed Signal IC (yes, Intel loves its code names). In its press release, Intel mentioned "significant power savings while increasing performance" in a design scalable across a range of devices including "high-end smartphones, tablets and other mobile handheld devices."

    So what does Intel promise for the Z6xx platform? Nothing very specific as of yet. Power "breakthroughs" include much lower power consumption at idle and with audio active (i.e. music playing), and a 2-3X reduction while browsing or playing video. That's good. Intel also promised a full 1080p video experience (already possible with the Z5xx chips, albeit perhaps not "full"), with clock speeds up to 1.5GHz and low-power LPDDR1 memory for smartphones, and 1.9GHz and faster with DDR2 memory for tablets (current Z5xx series chips range from the 1.1GHz Z510 to the 2.0GHz Z550). Intel highlights that the new chips result in a greater than 40% reduction in package area and a greater than 50% reduction in board area for the Z6xx and MP20, so their combined package real estate is less than 400 mm² and the board area required less than 333 mm². The new "Langwell" Intel Platform Controller Hub MP20 has a package size of 14 x 14 mm (same as Apple's A4) with a 0.5mm ball pitch and uses 65nm technology. That's down from the 22 x 22mm Poulsbo. The Z6xx chip itself also comes in a 14 x 14mm package (see below).
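
    Intel's package numbers are easy to sanity-check by multiplying out the dimensions quoted above; the sketch below does just that (the percentage is derived from those dimensions, not one of Intel's figures):

```python
# Package-area arithmetic for Moorestown, using the dimensions quoted
# above (lengths in millimeters, areas in square millimeters).
poulsbo_area = 22 * 22            # old "Poulsbo" chipset package: 484 mm^2
z6xx_area = 14 * 14               # "Lincroft" Z6xx CPU package: 196 mm^2
mp20_area = 14 * 14               # "Langwell" MP20 hub package: 196 mm^2

combined = z6xx_area + mp20_area
print(combined)                   # 392, consistent with "less than 400 mm2"

# The hub package alone shrank substantially versus Poulsbo:
shrink = 1 - mp20_area / poulsbo_area
print(f"{shrink:.0%}")            # 60% smaller than the Poulsbo package
```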

    From an architecture standpoint, the new Z6xx CPUs integrate a lot of the functionality that used to be part of the Poulsbo chipset, such as graphics, decoding/encoding, and the memory controller, leaving the "South Complex" "Langwell" chip to concentrate on I/O. The graphics core integrated into the "Lincroft" CPU is the same as that on the older "Poulsbo" chip, but the core can now run at up to twice the frequency and has been optimized for power and performance. Video decoding remains the same, but there's now 720p H.264 and MPEG4 encoding and also H.263 videoconferencing encoding. Intel says that 3D graphics performance should double.

    The "Briertown" Mixed Signal IC is meant to integrate components such as audio, touchscreen, voltage regulation, and display and comms drivers. Intel stressed that it will be available from multiple sources (such as Freescale, Maxim and Renesas).

    While more performance was desirable, less power consumption was essential if the new chips are to have a chance in the device market. So Intel did some major work on power states. Instead of the older system-wide power management, much greater power savings are now possible by giving each subsystem its own power management capabilities. So whenever any part of a "Moorestown" system is not needed, it's turned down or off. Intel refers to those savings mechanisms as "power islands" on both the MP20 hub and on the Z6xx chip and it's all done with an involved combination of software, hardware and firmware features. The sum total of all this is that the three chips that make up the Moorestown platform combined use less power under load than the first gen Menlow platform did when running idle.

    That's impressive, but also necessary. What Intel envisions for Moorestown-based devices is a range of form factors, from smartphones to sleek tablets, with 10 days of standby, two days of music playback, over five hours of video, multi-tasking/multi-windowing/multi-point video conferencing, 1080p playback and 720p recording, and "PC-like" internet.

    With Moorestown Intel is clearly taking another run at a market where it is simply not represented. Apple has set the bar for smartphones and tablets very high, and really nothing less than the kind of performance and battery life found in Apple products will do. The performance of current Atom-based systems, those assisted by NVIDIA chips not included, ranges from perfectly adequate to rather anemic, especially with video. It's Moorestown's task to potentially change that.

    NVIDIA likely won't be happy. Just as the first Atom N450/N470-based netbooks with NVIDIA's Ion graphics appear on the market, Intel throws another curve by integrating graphics into the very processor of the next generation of "internet device" Atom chips.

    For embedded systems and rugged/vertical market systems developers, the ongoing fragmentation of the Atom platform into two families, and the rapid obsolescence of the two most frequently used chips (the N270 and the Z530), is also not very good news. While more performance is always better, if the Moorestown platform turns out to be that much quicker and more economical, then products based on the older chips will suddenly have become a lot less desirable.

    Now what?

    As far as the future of Moorestown for smartphones and mobile internet devices goes, Intel will not only have to overcome ongoing confusion about its two Atom families, but it will also face formidable competition from the ARM processor architecture camp. That includes NVIDIA's Tegra, the Qualcomm Snapdragon, TI's OMAP and others.

    And then there is Apple. The iPad's A4 chip, also ARM-based, is Apple-only and thus not direct competition, but with the iPad Apple has shown what is possible with a tiny processor running at just 1GHz. The iPad is uniformly seen as an excellent performer with very quick browsing and excellent video playback. The iPad not only doesn't need a fan; it simply never warms up at all, not even after running video for hours. And with the iPad's ability to run video for ten hours or more, Intel's stated goal of "over 5 hours" of video looks modest at best.

    So Moorestown has a great deal to prove, and Intel has a lot to lose. If the platform succeeds, the N-Series branch of the family will suddenly look quite obsolete, which will require another tweak. If it fails, Intel's reputation of being behind in mobile chips will be confirmed yet again. No one's ever counting Intel out, but Atom, netbooks notwithstanding, has been a struggle.

    Posted by conradb212 at 10:42 PM

    May 4, 2010

    Publishing and the iPad

    This has nothing to do with rugged computing, but everything with publishing and how information is presented and distributed.

    As a former print publisher, I spent some time comparing different approaches to magazine publishing on the iPad. Given the amount of hype about the iPad being the savior of publishing, I am surprised there is not an iMagazine app or some such. I mean, Apple could take the lead here yet again, creating the iTunes of the magazine world.
     
    As is, everyone's doing their own thing, with Zinio, of course, having the lead with its hundreds of electronic titles. Problem is, they're not doing a thing different for the iPad. The PDF versions are faithful 1:1 equivalents of the print mags and it all works well, though a slight lag until each new page snaps into focus is annoying. And I am NOT willing to fill out long, cumbersome forms with address and credit card info to subscribe to a mag when it should all be 1-click.
     
    Time Magazine rolls their own, for now at the absurdly high price of $4.95 per issue. Their approach is sort of a hybrid between PDF, web design, and totally new stuff. It's very innovative, but it takes some getting used to. On the other hand, there really is no need to simply transform print to screen, even if it's print retrofitted with electronic extras (links, video, forms, etc.).
     
    So Time is experimenting. Pictures that may be tiny in a magazine due to space constraints can be large, with the text below them so you scroll down to read. When you zoom in to make text readable, pictures don't necessarily zoom with it; they don't need to. And how cool is it to have a full-page portrait of Lady Gaga or Bill Clinton that, when you rotate to landscape, becomes flawless high-definition video in which they speak to you?
     
    The iPad brings us another step closer to electronic publishing, a big one. But for now, no one is taking the definite lead. With the iBooks app and iBook store still a million miles behind Amazon, Apple probably has its hands full with filling in the many blanks, and an iMag app and store may not come to pass anytime soon, or ever. So Zinio and others have a window of opportunity, but it'll take more than selling individual mags for US$4.95 (Time) or making people put up with lag and an antediluvian 20th century style signup (Zinio).  

    Posted by conradb212 at 5:54 PM

    April 9, 2010

    Waterproofing rugged computing equipment

    During the course of testing in the RuggedPCReview.com lab, we examine ruggedness specifications and claims. For the most part, while we report and comment on those specs, we do not put them to the test. That's because ruggedness testing is pretty involved business, and checking how much punishment a device can take before it fails makes about as much sense as a car magazine running a test vehicle into a concrete wall to see if it is indeed as safe as the manufacturer says.

    There are, however, exceptions. If a manufacturer claims their product can be dropped from four feet without damage, we may try that. And if a product is advertised as being waterproof, we may check that claim out as well. And this is where it gets interesting.

    Most rugged products have an ingress protection rating in their specs. If the IP code system is used, as defined by international standard IEC 60529, the second digit in the code indicates protection against water. IP67, for example, means that the product is totally protected against dust (that's the "6"), and also protected against immersion in water down to one meter (3.3 feet) for up to 30 minutes (that's the "7"). IP68 means protection against continuous immersion, under conditions specified by the manufacturer.
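
    The code is simple enough to decode mechanically. As a sketch (the lookup tables below are abbreviated paraphrases of a few IEC 60529 classes, not the standard's full text):

```python
# Decode the two digits of an IEC 60529 ingress protection (IP) code.
# Descriptions are abbreviated summaries, not the standard's exact wording.
SOLIDS = {
    "5": "dust protected",
    "6": "totally protected against dust",
}
LIQUIDS = {
    "4": "protected against splashing water",
    "5": "protected against water jets",
    "7": "immersion to 1 m for up to 30 minutes",
    "8": "continuous immersion as specified by the manufacturer",
}

def decode_ip(code):
    """Split a rating like 'IP67' into solids and liquids protection."""
    solids_digit, liquids_digit = code[2], code[3]
    return (SOLIDS.get(solids_digit, "unknown"),
            LIQUIDS.get(liquids_digit, "unknown"))

print(decode_ip("IP67"))
# ('totally protected against dust', 'immersion to 1 m for up to 30 minutes')
```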

    So are there mobile computers that are waterproof? The answer is yes. There is a small, but not insignificant number of systems, primarily handhelds, that carry IP67 ratings. And the marketing for those systems often includes pictures or videos of full immersion. At trade shows you sometimes see waterproof handhelds or tablets sitting in tanks, running video to show that they are, indeed, alive and unharmed.

    Now it is abundantly clear that even machines that carry IP67 ratings are not dive computers and that few will ever even be immersed in water. However, given their intended use, they MAY fall INTO water, just as they may fall off a speeding pickup truck and get stepped on. Hence our occasional testing of the stated design limits and a bit beyond.

    That said, as a certified scuba diver with a good degree of experience, I've come across some pretty fascinating underwater electronics that are sealed. Diving is really interesting in that pressure plays a huge role. Each 33 feet of sea water (or 34 feet of fresh water) adds one atmosphere, or 14.7 psi, of pressure. You'd think that divers get crushed down at 100 feet, but that's not so because the human body is mostly water anyway (60-80%, depending on the individual), so all we have to worry about are the air spaces inside of us (lungs, sinuses, ears, mask mostly). We equalize pressure by breathing in pressurized air that perfectly counterbalances the water pressure. The result is that even at substantial depth, your dive mask doesn't leak at all; the flimsiest of seals will keep water out as long as there is no pressure difference and as long as there is indeed a seal that keeps air and water apart.
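
    The arithmetic is trivial, which is part of the point. A quick sketch using the rule of thumb above (one atmosphere, 14.7 psi, per 33 feet of sea water):

```python
# Absolute pressure at depth in sea water, per the rule of thumb above:
# one atmosphere (14.7 psi) for every 33 feet of sea water, plus the one
# atmosphere already pressing down at the surface.
ATM_PSI = 14.7
FEET_PER_ATM = 33.0

def absolute_pressure_psi(depth_ft):
    """Absolute pressure in psi at the given sea-water depth in feet."""
    return ATM_PSI * (1 + depth_ft / FEET_PER_ATM)

print(round(absolute_pressure_psi(0), 1))    # 14.7 psi at the surface
print(round(absolute_pressure_psi(33), 1))   # 29.4 psi: two atmospheres
print(round(absolute_pressure_psi(100), 1))  # 59.2 psi at 100 feet
```

    As long as the air inside a sealed space is brought to the same pressure, the seal itself sees no net load, which is why a flimsy dive-mask skirt holds at depth.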

    This means that, theoretically, if there were a way to dynamically pressurize the inside of a rugged computing device, even very delicate seals (like the very thin silicone skirt of a dive mask) would be enough to keep water out even at great depth. Now obviously, no one is about to put automated compressed air pressure equalization systems into a handheld computer; that is not what such devices are for. It's interesting, though, to examine how underwater electronics ARE sealed:

    - Most underwater cameras use special housings that still allow access to the camera's controls. They usually have one big O-ring seal for the housing clamshell, and then individually sealed pushbuttons.

    - Recently, an exceedingly simple waterproofing method for cameras has come on the market. It consists simply of a sealed bag of clear plastic with a lens in it. It doesn't protect against pressure, but it sure keeps the water out.

    - Dive computers (the ones that compute nitrogen loading, depth, dive time, remaining time, etc.) are sometimes oil-filled. Since oil cannot be compressed, there are no pressure issues.

    - There are a number of waterproof cameras now that can handle up to 33 feet of water. Examples are the Olympus Tough series, the Canon D10, the Panasonic TS2 and more. We've tested most of those down to 50 feet, and had one down to 77 feet. Those are regular cameras with LCDs, battery and I/O compartments, and numerous controls. So it might be interesting for rugged computer engineers to take one of those cameras apart and see how they do it. (By the way, LCDs sometimes get compressed so that the image is temporarily affected, and buttons are sometimes pushed in by the water pressure.)

    What does all this mean for the waterproofing of rugged mobile systems? Mostly that a good understanding of pressure and sealing is required to design reliable waterproofing. Apart from the fairly complex issues of pressure, there's also a good deal of common sense. Keeping things as simple as possible is key. In a setting where ANY failure can be fatal to the equipment, it only makes sense to keep the potential points of failure as few, and as simple, as possible. It is not surprising that NASA has always been big on the concept of "fail-safe," i.e. systems designed so that if they fail, they fail in a way that does not jeopardize the larger purpose, such as the survival of the astronauts. Likewise, scuba regulators are designed so that if they fail, they free-flow rather than shutting off the air, giving the diver a chance at survival.

    The conclusion is that the key to waterproofing of rugged computing systems is keeping things as simple as possible. This means keeping openings to the inside at a minimum, providing double protection whenever possible, and designing things to be as fail-safe as possible. Whatever seals there are must be totally reliable; resistant to twisting, ripping or falling out; durable; and easy to procure and replace. Seals should also be noticeable so users can see if something is amiss (we once failed to notice that a black O-ring in a black housing was missing, with nasty consequences). Rule #1 though is that the less there is to seal, the better.

    Posted by conradb212 at 8:52 PM

    April 3, 2010

    Finally: decent HD video on Atom boxes thanks to Broadcom card

    The dirty little secret of millions of Atom N270-based netbooks (and pretty much all other Atom-based systems) is that they really cannot run HD video. If you try it, you get choppy video that creeps along at frame rates of no more than 10 frames per second, even with just 720p video, let alone 1080p. This makes HD video on Atom-based systems impossible to watch. It's a huge disappointment for anyone who thought a "netbook" would surely be able to handle today's high definition media formats, and certainly an annoyance for many customers of vertical market Atom boxes as well.

    Well, third party to the rescue. And it's not NVIDIA (though that company's Ion technology will certainly improve the dire Atom platform graphics situation so that it becomes at least bearable). No, it's Broadcom, which offers an inexpensive add-on card that can transform virtually non-existent Atom high-def decoding into something respectable and quite useful.

    The Broadcom "Crystal HD" high definition hardware decoder BCM970012, with the Broadcom AVC/MPEG-2/VC-1 video/audio BCM70010/BCM70012 decoder chipset, is a PCIe Mini Card designed to enable full high definition real-time decoding on hardware that otherwise could not manage it. The board can decode H.264 480i/480p, 720p, and 1080i/1080p at up to 40Mb/second.
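
    To put that 40Mb/second ceiling in perspective, here's a back-of-the-envelope bitrate check (the clip size and running time below are made-up illustration values, not measurements from our testing):

```python
# Average-bitrate sanity check against the Crystal HD decoder's stated
# 40 Mb/s maximum. The clip below is a hypothetical example.
DECODER_LIMIT_MBPS = 40.0

def avg_bitrate_mbps(file_size_mb, duration_s):
    """Average bitrate in megabits per second (8 bits per byte)."""
    return file_size_mb * 8 / duration_s

# e.g. a 250MB clip running two minutes:
clip = avg_bitrate_mbps(file_size_mb=250, duration_s=120)
print(round(clip, 1))             # 16.7 Mb/s
print(clip < DECODER_LIMIT_MBPS)  # True: comfortably within the limit
```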

    I had read about the Broadcom solution last December, but never had a chance to check it out until an Atom N270-based Advantech ARK-DS303 digital signage player arrived at the RuggedPCReview lab. It had the Broadcom module installed as an option, since signage player customers may have a need for high definition video playback.

    To test the HD playback capabilities of the Broadcom Crystal HD decoder, I installed both QuickTime and the freeware Media Player Classic HomeCinema 1.3, copied a 250MB 720p high definition QuickTime (.mov) movie recorded on a Bonica HD video camera onto the Advantech player, and then ran the movie side by side on an Apple iMac27 and a 22-inch display hooked up to the ARK-DS303, set to 1680 x 1050 pixel resolution. Amazingly, the DS303 kept up with the vastly more powerful iMac throughout the movie, with playback quality being almost identical. There was a very slight choppiness at times, but it did not materially impact playback.

    I then ran a full 1080p MPEG4 movie trailer on the DS303 and it never missed a beat. In fact, it almost ran better than 720p video. That is very impressive for a low-power Atom player with just a gig of RAM and no fancy hardware. By comparison, an Acer Aspire One netbook with basically the same hardware as the DS303 sputtered along at just a few frames per second. Our findings are confirmed by this test of the Broadcom card by SilentPCReview.

    On a personal level, what this means is that if you have a netbook with an empty PCIe slot, you can get a Broadcom BCM970012 board on eBay or from a company like Logic Supply (see Logic Supply BCM970012), download the drivers from Broadcom (see Crystal HD download page), and finally have decent HD video playback on your little underachiever.

    For manufacturers and resellers of rugged tablets and other mobile devices based on the Atom platform, and especially the N270, N280 and the new N450 (and probably the D510 as well), by all means make the Broadcom board available at least as an option! Due to its low cost, the Broadcom BCM970012 PCIe board is virtually a no-brainer for N270/N280 systems, and the newer N450 systems can definitely benefit from Broadcom's follow-up BCM970015.

    The video below shows the same 720p (1280 x 720) clip playing on an iMac27 on the left and on the Atom 270-powered Advantech ARK-DS303 with the Broadcom module on the right.

    I should mention that the situation is somewhat different for devices based on "Silverthorne" Atom chips, i.e. those with Z5xx Atom processors. Those actually have hardware support for H.264 and other HD decoding. However, in order to take advantage of that capability, OEMs must include the necessary codecs, or users must run applications that come with those codecs (such as CyberLink or PowerVideo). For example, an Atom Z530-based Fujitsu LifeBook UH900 currently in the RuggedPCReview.com lab easily plays 1080p video at full frame rates.

    Posted by conradb212 at 4:38 PM

    March 31, 2010

    Will industrial tablets benefit from the iPad?

    On April 3rd, the Apple iPad tablet will be available in Apple stores. According to various reports, almost 300,000 iPads have been ordered before the device even became available. The hype is enormous, with experts falling all over themselves proclaiming why the iPad will succeed or fail.

    Fact is, at this point no one knows how the iPad will be received. Apple apparently felt comfortable enough with the tablet form factor to create the device and stake a good part of its reputation on it. Since the iPad is really a scaled-up iPhone rather than a pared-down MacBook, the question will be whether the iPhone experience indeed scales up to offer something the little iPhone just couldn't, or whether the larger form factor actually works against it as people may be more likely to compare it to a standard PC.

    One thing is for sure: the iPad will put the tablet into the harsh light of public scrutiny again. This, of course, isn't new. The original IBM ThinkPad of the early 1990s was a tablet, and then, just as now, the tablet/pad concept was sold as something millions were already familiar with: Scribble on a notepad or relax in a comfy chair with a tablet computer that feels like a book or print magazine. The major difference between then and now, apart from almost two decades of technological advancement, is that back then handwriting recognition was seen as the key to unlocking the tablet's potential.

    Unfortunately, handwriting recognition never quite worked out (though, with some training and given a chance, the software actually works very well), and current tablet efforts do not push recognition at all. Instead, the emphasis is on an attractive, elegant user interface with all the effortless swiping, pinching, bouncing and tapping that made the iPhone such a hit.

    Will the iPad benefit industrial tablets by bringing more attention to the tablet form factor? It's quite possible, but there are some pitfalls. The obvious one is that the iPad will set a standard of user expectations (effortless multi-touch, elegant UI, etc.) that Windows-based tablets will probably have a hard time meeting. Another is that Apple's effortless, elegant user interface requires a flawless implementation of projected capacitive touch technology, something that may work much better on a small consumer device than on a large vertical market device that users may want to operate with gloves on.

    It's probably reasonable to assume that the iPad will raise expectations as to how tablets should operate. And raised expectations always mean that older technology will be viewed as lacking. So if users are driven to vertical market tablets only to find that they do not live up to expectations, the rejection and dissatisfaction will be all the more severe. Which means the overall impact of the iPad's publicity could be negative if vertical market tablets do not offer the same general improvements that the iPad brought to the consumer market.

    What's the implication? For that we need to take a look at prevailing digitizer and touch technologies.

    Almost since the very start of pen computing, Wacom's inductive technology has dominated the digitizer market with its precise, sleek pens that do not need a battery. The pen functionality of the Windows XP Tablet PC Edition, which was launched in 2002, was clearly designed for the Wacom pen, and the technology, for the most part, works very well. The primary problem with active pen solutions, though, is that you're dead in the water if you lose the pen, and despite tethering, spares, etc., it's just a matter of time until a pen gets lost.

    Which is why resistive touch technology is being used as an alternative to inductive digitizers, and often in conjunction with them (so that the computer automatically switches from one to the other according to certain rules). The problem with resistive touch is that it is not very precise, not well suited for inking and recognition, and very prone to misinterpretation. After all, the digitizer must figure out how much pressure represents a "touch" and also differentiate between an intended touch (like from a stylus) and an unintended one (like the pressure of the palm of your hand). Resistive touch doesn't work very well with Windows and its tiny check boxes and scrollers, though legions of Windows CE/Windows Mobile users have learned to live with it.

    The respective shortcomings of inductive and resistive digitizer technologies led Apple to use projected capacitive touch, where a) it's either a touch or not a touch (no shades of gray depending on pressure), and b) multi-touch is possible. Combine that with Apple's interface magic and you have the elegant, effortless and seductive iPhone. Bingo, the perfect solution for tablets.

    Or is it? Over the past year we've seen multi-touch functionality added to a lot of tablets and notebooks. I've tried several of them, and none worked very well. Those systems would have a few demo showcase functions, but capacitive touch and multi-touch really did not make the systems easier to use, and they were just another feature rather than the main mode of operation (which is what makes the iPhone such a hit). So simply "having multi-touch" is not enough. It may even work against a product.

    This morning I came across a press release from Xplore Technologies, one of the earliest supporters and providers of vertical market tablet computers, in which its president, Mark Holleran, lauds the launch of the iPad as a great opportunity for the tablet form factor. Holleran points out the ease of use of tablets and views the launch of the iPad as a sign that "the tablet PC industry is poised for wider acceptance and accelerated growth."

    I do think Holleran is on to something, but even if the iPad is a rousing success, it'll still be a challenge to translate the iPad/iPhone user experience into an equally satisfying solution for vertical market tablets.

    Posted by conradb212 at 4:38 PM

    March 17, 2010

    Consumerization of rugged markets?

    A few weeks ago I wrote an article on Windows Mobile and the vertical markets and concluded with the question, "So what will the small but significant number of vendors who make and sell Windows Mobile devices do as their chosen operating system platform looks increasingly dated and is becoming a target of customer dissatisfaction?" I got some good (and rather concerned) feedback on that column, and I think it's an issue that is not going to go away.

    Yesterday I saw an article entitled "Delays Decimate Microsoft's Enterprise Mobile Market Share" at channelinsider.com, and they asked, "So, what of the rugged device market? A market largely dominated by bulky devices running Windows Mobile from manufacturers like Motorola and Intermec. Howe (Director of Anywhere Consumer Research at Yankee Group) says that enterprise applications are becoming more and more prevalent on consumer-grade smart phones, and the rugged hardware manufacturers will become more and more niche-focused."

    What they're saying is that enterprise and vertical market functionality is increasingly becoming available in inexpensive, standard consumer products, and the people who use that functionality do not want to walk around with two phones or handhelds. That's been pretty much accepted for a while now, as evidenced by the number of ruggedized handhelds that have integrated phones. The problem, though, as Howe puts it in the channelinsider.com article, is that "No one wants to go around looking like a UPS guy when they are out at the movies."

    And perhaps an even bigger problem is that no one, including the UPS guy, wants to put up with a clumsy, recalcitrant user interface that fights you every step of the way anymore. Not when the iPhone and Android and Palm have shown us that it can be done so much better.

    What will happen? I honestly don't think that Microsoft's mantra that handhelds need Windows CE because it leverages enterprise expertise washes anymore. Not when the handheld platform has been neglected to the degree Windows CE has been neglected. It's much more likely that Windows CE is still alive on vertical markets because it's a leap of faith to trust Apple or Google or open source to care about the relatively small vertical markets (even though some sales, like UPS, can be in the hundreds of thousands).

    Yet, the fact is that I can now take an iPhone, which doesn't even have a scanner, and scan barcodes with its built-in camera. Or take pictures with it that are far better than anything I've seen out of the integrated cameras on Windows CE devices. And due to the laws of physics, small devices are often inherently more rugged (and easier to ruggedize) than large ones. Does that mean we'll soon see slightly modified smartphones do the job of rugged handhelds?

    Probably not, but the thought definitely enters the mind.

    Posted by conradb212 at 8:31 PM

    March 10, 2010

    Will the iPad replace my iPhone?

    I wrote this column for the blog at iPhoneLife Magazine, a terrific resource for iPhone owners (or anyone interested in the iPhone) that's published by my old friend Hal Goldstein who used to be a friendly competitor when we published the print version of Pen Computing Magazine.

    The article really has nothing to do with rugged computing, but I think it's relevant here anyway because a) the fate of the Apple iPad will have a big impact on how tablets are viewed in the coming years, and b) because of the mobile industry's never-ending struggle to find form factors that are really right for a given job.

    So here's what I contemplated:

    This week I will order my iPad. Though I know it'll take a bit longer, I am aiming for the 3G model with 32GB of storage. When I get it, I will sign up for the unlimited data plan, forking over an even larger part of my disposable income to AT&T every month. What I do wonder is whether the iPad will replace my iPhone.

    Silly question you may say. The iPad is not a phone, so how could it replace the iPhone? True, but I really don't consider my iPhone as primarily a phone. It is, in fact, a pretty crappy phone, with voice quality worse than virtually any cellphone I've ever had, going back to the original Motorola "brick." But I do need a phone for the few calls I make, and it doesn't make sense to carry a much more convenient little fliphone in addition to the iPhone, and so, yes, the iPhone is my phone, too. But if I checked the number of minutes I use my iPhone as a phone versus for everything else, everything else would account for about 95%, at least.

    That's because the iPhone has pretty much become my information and entertainment device of choice. Before I leave the house I check the weather and temperature on the iPhone so I know what to wear. I get my news from the iPhone's USA Today and CNN apps (and even a couple of local and foreign newspapers), and more detailed news from the NY Times on the iPhone. I keep in touch with my Facebook friends on my iPhone. I read e-books on it. I play games on it. I use it when I go running and want to keep track of my time. I use it to check prices and read reviews while shopping. I check sports scores, the load on my servers, new messages on websites I post on. I do all that on my iPhone because it's so darn handy and convenient, and because it is good enough to do all those things. Had anyone told me a few years ago that, yes, it WILL be possible to use the web on a tiny device not as just a technology demonstration, but because it really works, I probably would not have believed it. After all, everyone had tried and it just didn't work. Until the iPhone.

    So now the iPad will do everything the iPhone can, but on a much bigger screen. No more squinting, no more screen rotating to make columns more easily readable, no more constant pinching to zoom in and out. That will all be a thing of the past as what we have all been waiting for is now here with the iPad, the book/magazine reading experience in an electronic device. Because that is the one remaining hang-up that keeps print newspapers and mags in business; they are more convenient than reading on a laptop computer.

    But now I wonder if the iPad will do everything the iPhone can, and do it better. Will I appreciate the much larger screen, or will it simply make the iPhone experience big and unwieldy? Will I have much higher expectations from a "real" computer like the iPad than I have of the little iPhone? For example, will I still tolerate the lack of Flash on the iPad? Will iPhone apps still look so terrific and clever on a much bigger screen, or will I expect real computer functionality? Will I start whining about the lack of "real" software? But most importantly, will I be able to use the iPad like I use the iPhone, just whipping it out wherever I am? Because if not, it may not work, and the new big iPhone will be something else that'll have to fly, or fail, on its own merits.

    Posted by conradb212 at 10:36 PM

    February 24, 2010

    Windows Mobile and the vertical markets

    While Windows Mobile has pretty much ceased to be a factor in consumer markets, it remains firmly entrenched in industrial and vertical markets, where its market share is probably larger than that of Windows on desktops and notebooks. The good news is that as long as Microsoft continues to dominate the desktop, the leverage of Windows programming tools and expertise will probably all but guarantee a continuing role for Windows CE and Windows Mobile. That said, the rapid vanishing act of Windows Mobile in the consumer markets must be disconcerting to those whose business depends on it.

    I won't go into the long and checkered history of Windows CE here, nor into Microsoft's bewildering meandering with nomenclature or the disruptive inconsistency and frequent course changes. It all has become an almost impenetrable mess even for longtime followers of Microsoft's smallest OS. Unfortunately, Windows Phone 7, Microsoft's latest knee jerk reaction to a changing smartphone market that has essentially relegated Windows Mobile into insignificance, casts more doubt and shadows on Windows Mobile than ever.

    While in the real world, the one where manufacturers make and sell rugged mobile products, we continue to see Windows CE 5.0/6.0 and Windows Mobile 6 and 6.1, in the hype and announcement world, Microsoft has announced the Windows Phone 7 OS, a trendy me-too music player interface trying to leverage the floundering Zune music player platform while copying iPhone and social networking concepts. It's hard to see how Windows Phone 7 could make a dent in the smartphone market, and it is most certainly not the future in commercial and vertical markets.

    So where does that leave vertical markets that probably aren't thrilled at the prospect of being stuck with an increasingly obsolete Microsoft mini OS? Not in a very good position. Let's face it, the odd Windows Mobile 6.5 interface is essentially unsuitable for vertical markets. And now that Microsoft, scrambling to remain relevant in the mobile market, is putting its eggs into the projected capacitive (multi) touch basket, it's hard to see how any of the older versions of Windows CE/Windows Mobile (now renamed "Windows Mobile Classic") may benefit from the Windows Phone 7 OS. Yet, something with "7" in it must happen, if only to give the impression that Windows Mobile is moving forward (and to benefit from the relative shine of Windows 7).

    So we have a situation where only last year, Microsoft's entertainment and devices division president Robbie Bach waxed enthusiastically about WinMo 6.5 ("It will give you access to more websites than you will be able to get to on an iPhone that will work actively and work well. It really is a much better experience.") and now the future of 6.5 already seems quite uncertain.

    Personally, I think what may happen is that Microsoft will quietly integrate Windows CE/Mobile into its Windows Embedded Products business. That area already includes Windows Embedded CE in addition to Windows Embedded Standard, Windows Embedded Enterprise, Windows Embedded POSReady, Windows Embedded Server, Windows Embedded NavReady, and so on. The stated purpose of Windows Embedded CE is to "develop small footprint devices with a componentized, real-time operating system. Used in a wide array of devices, including portable navigation and communications devices." That makes sense.

    One problem with this approach is that one part of the appeal of Windows CE/Windows Mobile was always that people were already familiar with its look and feel. Today, that look and feel is ancient, just as are the very visible underpinnings of Windows CE that essentially date back to the last millennium. And with Windows Mobile Pocket PCs gone and Windows Mobile phones irrelevant, that part of the leverage is gone as well. A new interface approach is sorely needed, but if Windows Mobile 6.5 and Windows Phone 7 are any indication, Microsoft's thrust is in the Zune player and social networking arena.

    So what will the small but significant number of vendors who make and sell Windows Mobile devices do as their chosen operating system platform looks increasingly dated and is becoming a target of customer dissatisfaction? That's a good question. You can never count Microsoft out, but after all the fumbling with their mobile OS over the years, hopes for a cohesive, logical and compelling direction for Windows Mobile seem optimistic.

    Posted by conradb212 at 7:45 PM

    February 16, 2010

    A look at Intel's new Core i3/i5/i7 processors and how they will affect rugged computing

    Just when most manufacturers of rugged mobile computers have switched from earlier platforms either to Intel Atom or Core processors, Intel raises the ante again with new Atoms and the next generation of Core processors. In essence, the Core 2 Duo that has served the mobile community long and well is being replaced by a next generation of mobile chips with higher performance, newer technology, better integration, improved efficiency, and smaller package sizes.

    The new Intel Core i3, Core i5 and Core i7 processors come in numerous versions with two or four cores, clock speeds ranging from 1.06 to 3.33 GHz, maximum power dissipation of 18 to 95 watts, different process technologies, different degrees of integration and different complementing chipsets.

    Unfortunately, while the difference between Intel's older Core 2 Solo and Core 2 Duo processors was pretty obvious, differentiating between the Core i3, Core i5 and Core i7 chips can be quite confusing. As a rule of thumb, the 3/5/7 represent Intel's "good," "better," and "best" processor solutions in any given category, just as BMW makes 3, 5, and 7 series cars (though the analogy only loosely applies). Core i3 processors, for example, do not have the Intel TurboBoost feature that provides extra performance via automatic overclocking and seems more than just a marketing gimmick. Core i7 processors generally have more cache and support more of the special Intel features and technologies than Core i5 and Core i3 processors. There is, however, more than a bit of overlap in functionality and performance, and figuring out which chip is best suited for a task won't be simple.

    As of February 2010, Intel has announced about three dozen of the new Core i3/i5/i7 processors. About a third of them are designated as embedded processors, which makes them especially interesting for embedded systems designers due to considerations such as package size, structural integrity, error-correcting code memory, and system uptime, as well as, perhaps most importantly, a 7-year extended life cycle.

    Intel has always had an excessive fondness for code names, and it's no different with the new Core processors. It's therefore useful to know that Intel distinguishes between the generally desktop-oriented "Piketon" platforms that use either two-core 32nm "Clarkdale" processors or four-core 45nm "Lynnfield" processors (and usually have TDPs that make them unsuitable for most mobile applications), and the mobile "Calpella" platforms that use two-core 32nm "Arrandale" processors with lower thermal design powers (generally 18 to 35 watts).

    So let's take a quick look at Calpella and Piketon.

    In essence, "Calpella" is Intel's new "low power" platform, though there is now a much sharper differentiation between the really low power Atoms and the low-power but rather high-performance new Core processors. The Calpella-class "Arrandale" CPUs are based on the latest 32nm lithography. They are essentially the successors of the mobile Core 2 Duo CPUs and come in standard, low voltage, and ultra low voltage versions at various processor clock speeds.

    There are, however, some interesting differences: As a first in this class of Intel CPUs, the memory controller and reasonably powerful integrated graphics with HD hardware acceleration and other new capabilities are now part of the processor, which means no more conventional Front Side Bus and no more "Northbridge" part of the chipset complementing the processor. The integrated graphics can be turned off when not needed, and NVIDIA (which is probably not thrilled about this Intel move) has already announced its "Optimus" technology, which automatically determines whether to use the integrated graphics to extend battery life or an external NVIDIA GPU to boost graphics performance.

    Like the Core 2 Duos, "Arrandale" processors have two cores, but the new chips use Intel's HyperThreading technology, which presents each physical core as two logical cores and makes the operating system think it is dealing with four cores. The new chips also add L3 cache, while the Core 2 Duo chips only had L1 and L2 cache. They require DDR3 RAM that supports higher speeds (up to 1,333MHz). An interesting new technology is Intel "TurboBoost," which automatically steps up processor core speed if it detects that the CPU is operating below certain power, current, and temperature limits.
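    The core-count arithmetic behind HyperThreading can be sketched in a few lines of Python (an illustrative toy, not Intel documentation; the function name is made up for this example):

```python
def logical_processors(physical_cores: int, threads_per_core: int) -> int:
    """Logical CPUs the OS schedules onto = physical cores x hardware threads per core."""
    return physical_cores * threads_per_core

# Core 2 Duo: two cores, no HyperThreading
print(logical_processors(2, 1))  # -> 2
# "Arrandale" Core i5: two cores, two threads each
print(logical_processors(2, 2))  # -> 4
```

    Note that the four logical cores the OS sees are not equivalent to four physical cores; the two threads on each core share its execution resources.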

    The new embedded Intel Core i5/i7 processors range from ultra low voltage models with a base clock frequency of 1.06GHz and a Thermal Design Power of 18 watts, to low voltage models with base clock frequencies up to 2.0 GHz and TDPs of 25 watts, and standard voltage models with base clock frequencies as high as 2.66GHz and a 35-watt TDP. This means they're suitable primarily for higher-end, high-performance rugged notebooks and tablets, but not for lower-end systems that require the still significantly lower power draw of Atom-based processor technology (or emerging alternate solutions such as the NVIDIA Tegra).

    "Piketon"-class processors also include a variety of Intel's new Core i3, Core i5, and Core i7 CPUs but unlike the "Arrandale" versions they are mostly standard voltage, higher-powered chips that include both older 45nm technology "Lynnfield" versions of the chips (four cores but no integrated graphics) as well as newer 32nm "Clarkdale" versions with two cores and integrated graphics. These are performance-oriented processors with TDP ratings of 73 to 95 watts and thus unsuitable for most mobile applications.

    What about performance? We haven't had a chance at benchmarking any rugged systems with the new processors yet. Intel and other benchmarks of all new Core processors suggest a hefty 30-60% performance increase over equivalent Core 2 Duo processors at roughly the same TDP levels. Literature and previews also suggest that the performance of the integrated graphics processor is improved by perhaps about 50% over that of the predecessor GM45 platform. There is also said to be better 3D performance, high definition video hardware acceleration, audio and other advancements.

    It should be interesting to see who is first in making the new i5/i7 chips available and how they'll work out in rugged systems.

    For a comparison table of all Intel Core i3/i5/i7 released through January 2010, see here.

    Posted by conradb212 at 6:22 PM

    January 28, 2010

    Talking with Paul Moore, Fujitsu's Senior Director of Product Development

    The other day I had a very interesting hour-long conversation with Paul Moore, who is Senior Director of Mobile Product Development at Fujitsu. The call was arranged by Fujitsu's ever helpful Wendy Grubow to give me a chance to talk with Paul about the Fujitsu Lifebook T4410 Tablet PC that's currently in the RuggedPCReview.com lab for evaluation and testing.

    Fujitsu, of course, has been into tablets longer than most and probably has more experience than any other Tablet PC and convertible vendor. Fujitsu had the PoquetPAD and 325Point tablets a decade before Microsoft reintroduced the Tablet PC in 2002, and the company is now in something like the 40th generation of tablet technology. Yes, the 40th. During the 1990s, Fujitsu built a successful business around vertical market slate computers, most notably the Point and Stylistic models, with the latter line carrying on to this day. For a while Fujitsu also offered Windows CE-based devices such as the PenCentra line. Fujitsu also offered small business-oriented notebooks with pens when almost no one else did. What it all boils down to is that no one has more corporate DNA in tablet and slate computers, in any number of form factors.

    Paul pointed out that at this point, Fujitsu is the only company that offers both slate AND convertible computers. There are many that have a notebook convertible in their lineups, such as Dell and HP, and there are some that only offer tablets, such as Motion Computing, but no one else offers both in this market (one could argue that DRS ARMOR and a couple of others offer both platforms, but those are in the heavily rugged markets).

    Anyway, it was interesting to hear Paul say that Fujitsu is seeing a heavy migration from tablet to convertible. Customers are transitioning from the Stylistics to the more conventional Lifebook convertible notebooks that can also be used as slates by rotating the display and laying it down flat on top of the keyboard. That probably explains why Fujitsu is now down to a single model in the Stylistic line, the Stylistic ST6012, whereas the company offers no fewer than six different convertibles (the Lifebook T1010, T1630, T2020, T4310, T4410, and T5010).

    With Panasonic making a big issue out of their rugged computers still being made in Japan, I asked Paul if the Fujitsu tablets and convertibles are also still made in Japan. The answer was yes, all Lifebook tablets are made in Japan, and all E-Series machines as well. However, while Panasonic clearly draws a connection between made in Japan and much lower failure rates, Fujitsu makes no such claim. Paul said failure rate stats are compiled, but the vast differences in the markets served make any meaningful comparison essentially impossible.

    I asked Paul why Fujitsu does not market its computers as "business-rugged," "semi-rugged," or one of the other ruggedness categories. The unequivocal answer: We don't have rugged tablets. Ours are durable, well-built, according to the markets we serve. We don't lose many customers because of ruggedness requirements. Fair enough. Full or even partial ruggedness can add a lot of cost and weight, so if it is not needed, why add it. Paul points out useful features that prolong the life of a computer, like a user-cleanable dust filter, accelerometer-based hard disk protection, a display hinge that rotates in both directions so it won't get damaged by inadvertently turning it the wrong direction, and so on.

    With reference to the rotating display hinge, I asked Paul whether he knew why all Tablet PCs since 2001 have been designed with the same exact rotating hinge that lets users rotate the display and then fold it flat on top of the keyboard, LCD facing up. This is a good solution, but in notebook mode, the display flexes when you tap it with the pen. In the 1990s there had been several alternate solutions that minimized or eliminated the flex problem, but they are all gone. Paul said he wasn't aware of any patent protection or other reason why designers should be limited to the rotating displays, but it's a solution that works; flexing is not an issue when the device is used in tablet mode, and with the increasing importance of touch it matters even less. Cost, too, might be a factor in staying with standardized solutions.

    We also discussed the inherent suitability of a full desktop operating system for tablet and touch use. In my opinion, Windows itself has always been a major factor standing in the way of widespread tablet adoption; it's simply not suitable for pen operation. Paul felt that Windows 7 has made great strides towards better usability, but that in vertical markets it's really all about custom applications anyway, and those are usually optimized for whatever input medium is used.

    With the recent advent of Intel's new Piketon and Calpella processor/chipset platforms I asked Paul what Fujitsu's plans were for the Intel Core i3/i5/i7 processors. His answer was that, for the most part, they prefer to use standard voltage processors that generally cost less, offer better performance, and represent an overall better value for users. Based on the benchmark result of our review unit that's equipped with a 2.53GHz Core 2 Duo P8700 with a thermal design power of 25 watts, we see no immediate reason for a chip upgrade: the T4410 scored the highest overall performance results of any Tablet PC we have ever tested, and it still had an idle power draw of just 9.9 watts, barely more than most Atom-based systems.

    Posted by conradb212 at 6:35 PM

    January 26, 2010

    Tablet hype at fever pitch

    A day before an Apple event where Steve Jobs will announce a new computing device, the hype about tablets is at an absolute fever pitch. Experts are popping out of the woodwork, showering us with their wisdom and predictions, most apparently believing that Microsoft invented and introduced the tablet in 2001, which couldn't be farther from the truth. But, perhaps, if enough instant experts say it's so, history has been rewritten. What will those instant experts do when they discover that the original early 1990s IBM ThinkPad was a tablet, and that we saw the same exact tablet hype back in 1989-92?

    That said, if Apple indeed releases a tablet device, it may well change things quite a bit.

    Posted by conradb212 at 5:08 PM

    January 7, 2010

    Slate and tablet computers: learning from the past

    According to CNN, tablet-sized computers are now "a much-hyped category of electronics." True. The Associated Press says, "Tablet-style computers that run Windows have been available for a decade." Yes, and a lot longer than that. And a PC World editor states, "Tablet PC's are not new. The slate form factor portable computer has been around for almost a decade, since Microsoft initially pushed the concept with its Windows XP Tablet PC Edition." Nope. Microsoft did not initially push the concept with the XP Tablet PC Edition. Microsoft released a tablet OS way before that, in 1991, and even then it was just a reaction to what others had done before.

    This shows how soon we forget. Or perhaps how effective current coverage has been in creating the impression that Microsoft invented tablet computers in 2001, rewriting history in the process. Fact is, slate and tablet computers have been around for a good 20 years, and in 1991, there was as much hype about slates as we have today.

    A bit of slate computer history

    In the late 1980s, early pen computer systems generated a lot of excitement and there was a time when it was thought they might eventually replace conventional computers with keyboards. After all, everyone knows how to use a pen, and pens are certainly less intimidating than keyboards.

    Pen computers, as envisioned in the 1980s, were built around handwriting recognition. In the early 1980s, handwriting recognition was seen as an important future technology. Nobel prize winner Dr. Charles Elbaum started Nestor and developed the NestorWriter handwriting recognizer. Communication Intelligence Corporation created the Handwriter recognition system, and there were many others.

    In 1991, the pen computing hype was at a peak. The pen was seen as a challenge to the mouse, and pen computers as a replacement for desktops. Microsoft, seeing slates as a potentially serious competition to Windows computers, announced Pen Extensions for Windows 3.1 and called them Windows for Pen Computing. Microsoft made some bold predictions about the advantages and success of pen systems that would take another ten years to even begin to materialize. In 1992, products arrived. GO Corporation released PenPoint. Lexicus released the Longhand handwriting recognition system. Microsoft released Windows for Pen Computing. Between 1992 and 1994, a number of companies introduced hardware to run Windows for Pen Computing or PenPoint. Among them were EO, NCR, Samsung (the picture to the right is a 1992 Samsung PenMaster), Dauphin, Fujitsu, TelePad, Compaq, Toshiba, and IBM. Few people remember that the original IBM ThinkPad was, as the name implies, a slate computer.

    The computer press was first enthusiastic, then very critical when pen computers did not sell. They measured pen computers against desktop PCs with Windows software and most of them found pen tablets difficult to use. They also criticized handwriting recognition and said it did not work. After that, pen computer companies failed. Momenta closed in 1992. They had used up US$40 million in venture capital. Samsung and NCR did not introduce new products. Pen pioneer GRiD was bought by AST for its manufacturing capacity. AST stopped all pen projects. Dauphin, which was started by a Korean businessman named Alan Yong, went bankrupt, owing IBM over $40 million. GO was taken over by AT&T, and AT&T closed the company in August 1994 (after the memorable "fax on the beach" TV commercials). GO had lost almost US$70 million in venture capital. Compaq, IBM, NEC, and Toshiba all stopped making consumer market pen products in 1994 and 1995.

    By 1995, pen computing was dead in the consumer market. Microsoft made a half-hearted attempt at including "Pen Services" in Windows 95, but slate computers had gone away, at least in consumer markets. The technology lived on in vertical and industrial markets, where companies such as Fujitsu Personal Systems, Husky, Telxon, Microslate, Intermec, Symbol Technologies, Xplore, and WalkAbout made and sold many pen tablets and pen slates.

    That was, however, not the end of pen computing. Bill Gates had always been a believer in the technology, and you can see slate computers in many of Microsoft's various "computing in the future" presentations over the years. Once Microsoft reintroduced pen computers as the "Tablet PC" in 2002, slates and notebook convertibles made a comeback, and new companies such as Motion Computing joined the core of vertical and industrial market slate computers specialists.

    So now tablets, or slates as Ballmer called them in his CES speech, are once again a "much-hyped category of electronics." The difference is that this time, thanks to Apple and the iPhone, tablets are to have multi-touch.

    Let's hope all this works. Technology has come a very long way since those early days of tablet computers, but hype is never good if it's based on a flood of me-too products built around a concept that has yet to prove it can work.

    For an illustrated history of tablets and slates, see excerpts of "The Past and Future of Pen Computing" by RuggedPCReview.com editor Conrad H. Blickenstorfer, presented as a keynote address at the Taipei International Convention Center in December of 2001.

    Posted by conradb212 at 4:37 PM

    January 4, 2010

    Getac now offers 5-year warranties!

    Sometimes the most amazing news is not a product announcement. That's what I thought when I saw Getac's press release about offering 5-year "bumper-to-bumper" warranties for all their rugged notebook computers. That's a long time.

    According to Getac, the new warranty covers all of their fully rugged computers (i.e. the A790, B300, E100, M230 and V100 models) delivered on or after January first of this year. And the warranty includes "damage that occurs due to accidental acts and exposure to environmental conditions". According to Getac president Jim Rimay, the company did this because in these tough economic times, computers are more likely to be replaced on a 5-year cycle instead of the 3-year upgrade cycle of more prosperous times. With a full 5-year warranty, customers will not incur additional service/warranty fees if they keep their equipment longer. The 5-year warranty is also a welcome change, the press release says, for governments and other large entities where getting approval for equipment repair can be a lengthy and involved process (it can, I've been there).

    Five years is a long time, and especially so for a product that is designed to be used outdoors and under demanding environmental conditions where it is much more likely that computers are dropped, bumped around, rained on, and just generally experience conditions far from those in a nice, warm, clean office. It'd be interesting to know the actual mechanics of the warranty, what all is included, if certain items are excluded, what the turn-around is, shipment costs and so on. I am sure Getac thought this through, and we'll put in an inquiry to the folks at Getac.

    How important are warranties and service in this field? Extremely so. I've personally visited the service and repair facilities of the leaders in the rugged computer market and came away more than impressed. Unlike in the commercial market where service is often hit-or-miss, with rugged systems failure rates, failure statistics and service turn-around times are meticulously recorded and managed. That's because with rugged systems, total cost of ownership matters and a good reputation for service and a good warranty definitely represent a strategic advantage.

    Getac is on to something here, and offering a 5-year warranty definitely offers significant value-added to their products.

    Posted by conradb212 at 4:18 PM

    December 22, 2009

    New Atom processors: N450, D410 and D510

    On December 21, 2009, Intel announced the next generation of Atom processors: the single-core N450, the single-core D410, and the dual-core D510.

    Up to this announcement, millions of netbooks (as well as related devices such as tablets and boards) used the Atom N270 processor with its two companion chips, the ICH7M I/O chip and the 945GSE graphics and memory controller. The combo of the latter two is known as the Intel 945GSE Express chipset and makes for a total of three chips. Of N-Series processors released prior to this latest announcement, the Atom N280 was really just a very slightly faster N270 (1.66GHz vs 1.6GHz), and the Atom 330 (technically not N-series, but still in the "Diamondville" family as opposed to the more industrial "Silverthorne" Z-series Atoms) a dual-core version of the desktop-oriented Atom 230.

    With the new chips, the big news is that Intel reduced the chip count from three to two by integrating the graphics and memory controller into the CPU itself. The old ICH7M I/O controller chip is replaced with the Intel NM10 Express. This means fewer chips to mount, lower power consumption, and, not mentioned, one less reason to seek third party chipsets (such as NVIDIA's Ion Graphics Processors).

    Of the three new processors, the N450 is specifically geared towards netbooks, whereas the D410 and D510 processors, all working in conjunction with the new NM10 I/O controller, are geared towards low-end desktops. The new NM10 I/O controller consumes just two watts compared to the older ICH7M southbridge's 3.3 watts. More amazingly, while the old 945GSE northbridge chip with its GMA950 graphics and memory controller consumed six watts, the Graphics Media Accelerator 3150-based integrated solution adds only about three watts to the netbook-oriented N450: the N450's maximum TDP (thermal design power, a measure of power consumption) is 5.5 watts, versus 2.5 watts for the N270 without graphics.

    From what I can tell, the GMA3150 has hardware acceleration for MPEG-2 but not for H.264, so there's still no HD hardware decoding, which means a third-party HD decoder chip will come in handy. Onboard video is now likely to move from the 17:10 aspect ratio 1024 x 600 pixel format to a somewhat more palatable 1366 x 768 pixels, with significantly higher (2048 x 1536) external analog video possible (though some reports say that the N-Series chip is limited to 1400 x 1050, which would be less than what we have now). Somewhat surprisingly for a new chip, memory support is for DDR2 instead of the newer DDR3 standard.
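    For the curious, the aspect ratios of the resolutions mentioned above reduce as follows (a quick Python check using only the standard library; 1024 x 600 works out to 128:75, which is close to 17:10):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> tuple:
    """Reduce a pixel resolution to its simplest whole-number aspect ratio."""
    g = gcd(width, height)
    return (width // g, height // g)

print(aspect_ratio(1024, 600))   # -> (128, 75), roughly 17:10
print(aspect_ratio(1366, 768))   # -> (683, 384), roughly 16:9
print(aspect_ratio(2048, 1536))  # -> (4, 3)
```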

    Transistor count goes from the N270's 47 million to 225 million in the new single-core models and 317 million in the new dual-core chip, which means the CPU core alone goes from 47 to 92 million transistors, with the graphics and memory controllers using about 133 million transistors. What exactly the extra 45 million CPU transistors do is not clear, as the tech specs look pretty much the same.
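    The transistor arithmetic above can be sanity-checked in a couple of lines (figures in millions, as quoted from Intel's announcement):

```python
# Transistor counts in millions, as cited in the text.
single_core_total = 225  # new single-core Atom, graphics/memory now on die
dual_core_total = 317    # new dual-core Atom

# Adding a second core adds this many transistors per core...
per_core = dual_core_total - single_core_total   # 92 million
# ...so the shared graphics/memory controller accounts for the rest.
uncore = single_core_total - per_core            # 133 million

print(per_core, uncore)  # -> 92 133
```

    Two cores plus the shared controller gives 2 x 92 + 133 = 317 million, matching the dual-core figure, and 92 versus the N270's 47 million is where the unexplained extra 45 million CPU transistors come from.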

    Note that Intel targets the D410 and D510 processors specifically at desktops. Though the D410 has the same clock speed and uses the same NM10 I/O controller as the N450, its max TDP is almost twice as high: 10 watts versus just 5.5. That's likely due to the graphics core running at twice the speed in D-series chips (400 vs 200MHz).

    Overall, it doesn't look like the new Atoms, which have the Intel 64 extensions, will bring much of a performance improvement to netbooks and netbook-level rugged or embedded devices. Reducing the chip count from three to two is nice, but the Z-series processors already had that. Graphics seem somewhat improved, but not enough to make a huge difference, and there's still no HD playback hardware support. I am also not quite sure why the D410 and D510 processors are aimed at the desktop when the D410 chip combo has a total system TDP that's the same as that of the N270 and N280 (12 vs 11.8 watts), and the dual-core D510 just a bit more (15 vs. 11.8 watts). Also interesting is that Intel highlights the smaller footprint when it was a larger footprint that was lauded at the introduction of the "large package" P and PW series of industrial processors just a bit ago.
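    Summing the per-chip TDP figures quoted in this post shows where those platform totals come from (a back-of-envelope check; the D510's own 13-watt figure is inferred from the quoted 15-watt platform total):

```python
# Per-chip maximum TDP figures in watts, as quoted in the text.
n270_platform = 2.5 + 6.0 + 3.3   # N270 CPU + 945GSE northbridge + ICH7M = 11.8 W
n450_platform = 5.5 + 2.0         # N450 CPU + NM10 = 7.5 W
d410_platform = 10.0 + 2.0        # D410 CPU + NM10 = 12.0 W
d510_platform = 13.0 + 2.0        # dual-core D510 + NM10 = 15.0 W (13 W inferred)

for name, watts in [("N270", n270_platform), ("N450", n450_platform),
                    ("D410", d410_platform), ("D510", d510_platform)]:
    print(f"{name} platform: {watts:.1f} W")
```

    The N450 platform is the clear power winner; the "desktop" D410 combo lands at essentially the same total as the old three-chip N270 netbook platform.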

    Overall, it's good to see these new Atom chips, although I can't help feeling that Intel looked out for itself more than it added compelling value for consumers.

    Here is Intel's list of the entire Atom processor family.

    Posted by conradb212 at 5:47 PM

    December 18, 2009

    The Atom processor predicament

    Well, this is going to be interesting. Despite the Intel Atom chips' modest performance, consumers have bought millions and millions of those little netbooks. I am quite certain they bought them because of the low price that made netbooks an impulse buy as opposed to spending more for a "real" notebook computer.

    Whether or not customers are happy with their netbooks largely depends on how they use the computers. The small display with 1024 x 600 pixel resolution is confining for almost any real work as there's just not enough real estate. And while the term "netbook" implies that the devices are especially well suited for accessing the web and browsing around, that really isn't true. Netbooks are generally sluggish browsers and mostly unable to deliver adequate multimedia performance. And those who hoped to run HD video on their netbooks struck out completely, because first-gen netbooks simply couldn't do that at an acceptable pace.

    On the other hand, the netbooks' small size and weight made them wonderful travel companions, and with an extended battery they practically ran forever on a charge (well, six hours or more in the case of my Acer Aspire One). And when hooked up to a big screen and a full-size keyboard, netbooks work really well as office computers. I hook up my little Acer to a 1680 x 1050 pixel 22-inch wide-screen.

    However, we always want more, and so netbooks have been creeping up in size and power. Display size went from 7 to 8.9 inches, then 10.1 and now 12.1 inches. Which means netbooks are morphing ever closer to standard notebook range, which also means customers will continue to want and expect more. I mean, if the netbooks are so large now, why not an optical drive, and could we have the screen just a bit larger yet? Obviously, what customers really want is a device that costs as little as a netbook, but is as large and powerful as notebooks were before they became hefty giants with 19-inch ultra-wide-format displays.

    Problem is, the Atom N270 simply isn't up to powering anything more than a little netbook, and even that just marginally. So Intel released the very slightly more powerful N280 and the dual-core Atom 330. And NVIDIA came up with its Ion graphics chipset that is supposed to work better with Atom N-Series chips than Intel's own chipset. I recently read a review of the Asus Eee PC 1201N netbook that uses both the Atom 330 chip and the NVIDIA chipset, has a 1366 x 768 12.1-inch screen and lists for US$499. According to the review, you can now actually watch HD video, play many games, and things feel quite a bit less sluggish. Battery life is less than it was for the older, smaller netbooks, of course, and for 500 bucks you can easily get a "real" notebook with far higher performance and many more features.

    Why do I bring all this up? Because the rugged market has also heavily invested in Atom technology and almost everyone has Atom devices in their lineup or pipeline. Almost all of them are based on either the Atom N270 or the Z510/530/540, i.e. the first generation of Atoms, the minimal ones with "targeted" performance. And now, just as we're starting to see nicely optimized Atom systems that live up to battery life expectations, some of those initial chips are already going to be replaced by the N280, the dual-core Atom 330, and soon by next-gen Atom chips. That's bad news for rugged manufacturers whose first-gen Atom products are just now becoming available.

    The moral of the Atom story is, at least for vertical market manufacturers: pick an Atom chip that Intel is likely to support for several years, and make certain the drivers are fully optimized and all the power saving features are fully implemented. Atom can deliver superior battery life and acceptable performance, but manufacturers must carefully target those products so customers won't be disappointed. We've seen Atom-based machines that use hardly less battery power than devices with much more powerful processors. That won't do. And we've seen some where non-optimized graphics drivers made the machines painful to use.

    Using an automotive analogy, with the Atom Intel created a small and miserly 4-cylinder engine for use in fuel-efficient vehicles that provide adequate performance as long as the car isn't too big and heavy and customers have not been led to have unrealistic expectations. With the new and upcoming Atom chips, Intel is already making bigger, more powerful engines, obsoleting the earlier ones and giving in to the demand for more horsepower at the expense of efficiency and good design.

    Posted by conradb212 at 1:41 AM

    October 29, 2009

    Apple stores supposedly transitioning from WinMo to iPod Touch

    Anyone who's ever been to an Apple store for an appointment or service knows the weird procedure where someone greets you at the door, takes your info, and then wirelessly sends it to some other Apple people who then come greet you when it's your time. Same for making payments away from the main desk and so on. It all works, but it's a bit odd, and even weirder is that some of that mobile check-in and checkout is done on non-Apple hardware (Symbol, actually) that's running Windows CE software. Supposedly it was done that way because Apple mobile gear couldn't handle bar codes and credit cards and such.

    I always thought that was strange because there are all sorts of scanning and credit card processing apps available for the iPhone. And, in typical iPhone fashion, they are being used in cool, innovative ways. For example, there's an app ("Red Laser") that scans a barcode and then instantly checks the Web for the best prices for that product. That way you always know whether you're getting a good deal. There are also numerous apps for credit card processing. That should not come as a surprise in an era where banks are starting to allow you to remotely "deposit" checks from an iPhone.

    Anyway, the folks at ifoapplestore.com now report that Apple stores may be transitioning to iPod Touches with an advanced scanner accessory and point-of-sale (POS) software for checkout. Other businesses will probably follow their lead. And I can easily see iPhones and iPods being used in more industrial applications thanks to all those ruggedized cases available now (my favorite one is the Otterbox Defender). Can iPhone-based industrial-strength vertical market apps be far behind?

    Posted by conradb212 at 6:36 PM

    October 23, 2009

    Windows 7

    Well, the much advertised public release date of Windows 7 has come and gone. The equivalent of "War and Peace" has been written on how wonderful it is and on how Microsoft "got it right" this time. Maybe they have and maybe they haven't. Here at RuggedPCReview.com, we've used Windows 7 on some of the rugged hardware we've had here for testing and evaluation recently and, frankly, it looked so much like Vista that we barely noticed anything was different.

    At this point, I have mixed feelings. Almost all the rugged hardware that comes in here still runs Windows XP or the Tablet PC Edition or, increasingly, one of the embedded versions of Windows. It was actually interesting to see all those "XYZ recommends Vista" tag lines on manufacturers' websites and promotional materials when most of their machines really still ran XP.

    So now Windows 7 is here, and Microsoft has been quite successful in creating the buzz that it's new and leaner and faster than Vista. Some of the industry pundits were practically falling all over themselves heaping praise upon Microsoft, so much so that it was almost embarrassing. Steve Wildstrom at Business Week, whose straightforward opinions I greatly respect, was quite critical of the unacceptable upgrade from XP to Windows 7 (reinstall every app from scratch) and of how long the upgrade takes, but he then also said Windows 7 was "something truly better."

    I think whether or not Windows 7 is indeed something truly better will eventually determine the fate of Windows 7. It looks so much like Vista that had it not been for Vista's questionable reputation, Microsoft probably would have simply called the "new" OS Vista Service Pack 3. As is, that wasn't an option. From a PR standpoint, Vista was so damaged that almost anything would look better. So creating something that is not as bad as Vista is like General Motors improving the Corvair back in the 1960s. It really was a pretty good car in the end, but Ralph Nader's "Unsafe at Any Speed" had damaged the Corvair beyond repair. So from that point of view, having Windows 7 look like Vista and simply saying it's better than Vista may not have been a great idea.

    But let's assume that Windows 7 is better than Vista and that Microsoft really has learned and listened. Then you still have the problem that a good number of users will have to upgrade from XP to Windows 7, which happens to be perhaps Windows 7's most frustrating aspect. That particularly applies to corporate users, since many shops never migrated to Vista at all. It's conceivable that Windows 7, Vista-like though it is, may indeed cause a lot of companies to finally make the migration from XP, but that may mostly be because by now XP is two generations out of date and Microsoft very actively discourages the use of XP.

    Only time will tell. It seems almost unthinkable that the world will wholesale reject another Microsoft OS the way Vista was rejected. I mean, a company cannot continue to have 90+% of the market when its new products are rejected. This is why Windows 7 is hugely important to Microsoft. If it's another failure, and the coming weeks and months will tell whether the media enthusiasm will give way to user frustration or not, then, Redmond, we have a problem. If the Vista flop is forgiven like Windows ME was eventually forgiven, Ballmer & Co will likely breathe a huge sigh of relief.

    Does it all matter in the rugged space? Not as much as it matters in the consumer and commercial markets. The major players will make sure their product lines are able to run Windows 7 well. And an increasing number may look to Windows Embedded, now that it's called Windows Embedded Standard and "XP" has been banished from the name, though for now it's still really XP (Windows Embedded Standard 2011 will be Windows 7-based).

    As expected, Apple is having a field day with the Windows 7 release, running one funny "I'm a PC and I'm a Mac" commercial after another. And just as many would love to have iPhone ease-of-use and functionality on their industrial handhelds, many wish the Mac OS were available on rugged machines. But it's not, and so we truly hope that Windows 7 will give the world a productive and reliable computing platform to work on.

    Posted by conradb212 at 7:37 PM

    October 7, 2009

    Getac to offer multi-touch on its V100 rugged Tablet PC

    Multi-touch has been all the rage ever since Apple showed the world the effortless elegance and utility of the iPhone's two-finger pinch and spread to zoom in and out. So what is multi-touch? Basically, it means the touch screen is able to accept simultaneous input from more than one position. While multi-touch on the iPhone is currently limited to two fingers, there is theoretically no limit to the number of simultaneous touches.

    What is multi-touch good for? Well, Apple's super-elegant zooming certainly got everyone's attention, but multi-touch can also be used for things like rotating with a two-finger screw-in or screw-out motion. In addition, multi-touch can be used for gestures, and the functionality can be built into vertical market custom applications.
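    The pinch-zoom and two-finger rotate behavior described above reduces to simple geometry on two touch points: the zoom factor is the ratio of the finger distances, and the rotation is the change in the angle of the line between the fingers. Here's a minimal sketch of that idea; the helper names are mine, not any vendor's actual touch API:

```python
import math

def span(p1, p2):
    """Distance and angle of the line between two touch points (x, y)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

def two_finger_gesture(start, end):
    """Zoom factor and rotation (degrees) implied by two fingers
    moving from their start positions to their end positions."""
    d0, a0 = span(*start)
    d1, a1 = span(*end)
    return d1 / d0, math.degrees(a1 - a0)

# Two fingers spread apart horizontally: 100 px apart -> 200 px apart.
zoom, rotation = two_finger_gesture(((100, 100), (200, 100)),
                                    ((50, 100), (250, 100)))
# zoom == 2.0 (the spread doubled the distance), rotation == 0.0
```

    A real driver has to track which touch belongs to which finger across frames and filter out jitter, but the gesture math itself is this simple.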

    While the Apple iPhone achieves its multi-touch capability with projected capacitive touch screen technology, that wouldn't work very well in industrial applications where users often wear gloves. For those applications you need a more traditional resistive (pressure-sensitive) touch screen.

    There are currently a number of companies working on providing resistive multi-touch systems. Among them are Stantum, Touchco, SiMa Systems, and several others. Some of these products are in the development stage, others are currently available, and each technology is targeted at certain types of applications.

    On October 7, 2009, Getac announced that its V100 rugged Tablet PC will offer a multi-touch screen that can be used with or without gloves. According to Getac's press release, this marks a first for rugged computers, and the multi-touch feature will enable users to rotate maps and pictures, zoom in and out of manuals and other documents, move and edit, navigate, and employ a series of special gestures that go beyond what is possible with traditional touch screens that only recognize a single touch.

    While the technology used by Getac wasn't mentioned in the press release materials, Getac added an explanatory page to its website (see here). Getac resellers and developers will certainly have an interesting tool to work with.

    Posted by conradb212 at 5:54 PM

    Gorilla Glass -- lighter and tougher display protection

    On October 6, 2009, Motion Computing announced that their C5 and F5 were the first Tablet PCs to use Corning's Gorilla Glass. What is Gorilla Glass? In its press release, Motion states that it is "thin-sheet glass that was designed to protect against real-world events that cause display damage."

    To learn more I scheduled a call with Corning's Dr. Nagaraja Shashidhar. To prepare myself I checked Corning's very informative page on Gorilla Glass. They have some videos there that show the glass being bent and steel balls falling onto it. The glass neither shatters nor breaks. In fact, it's hard to believe it's glass at all. It looks more like a very thin sheet of some polycarbonate plastic or acrylic. But it is glass.

    The secret, according to Dr. Shashidhar, lies in a special chemical ion-exchange strengthening process that results in what Corning calls a "compression layer" on the surface of the glass. The primary purpose of that layer is to act as an armor that guards against the nicks and tiny cracks that then result in the glass breaking. And even if there are tiny nicks, the layer keeps them from propagating.

    What's amazing is just how thin the glass is. Corning makes it in thicknesses ranging from 0.5mm to 2mm, or 1/50th to 1/12th of an inch. The Gorilla Glass used in the Motion tablets is just 1.2mm thick, yet it provides the protection of a much thicker layer of protective glass at a fraction of the weight. And a thinner layer of protective glass doesn't only mean less weight, it also makes for a more natural feel when using the tablet. With thick glass it sometimes looks like the tip of the pen hovers far above the actual screen. That's not the case with the Gorilla Glass-equipped Motion tablets.
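    As a quick sanity check of those thickness figures (my arithmetic, not Corning's), converting millimeters to the nearest "1/n of an inch" fraction:

```python
MM_PER_INCH = 25.4

def as_inch_fraction(mm):
    """Express a thickness in mm as approximately 1/n of an inch."""
    return round(MM_PER_INCH / mm)

# 0.5 mm works out to about 1/51 in and 2 mm to about 1/13 in --
# close to the rounder 1/50 and 1/12 figures commonly quoted.
# The 1.2 mm layer Motion uses is roughly 1/21 of an inch.
```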

    I had actually had some face time with a Motion F5 tablet with the new glass before Motion announced it. I took the opportunity to not only examine the new display, but also benchmark performance and battery life with the new and more powerful processor Motion now uses for the C5 and F5. I also did side-by-side comparisons between an original Motion F5 and the latest model (see full report).

    I must admit that it's a bit hard to figure out all the F5's display technologies. You start with a Hydis display that now has AFFS+ technology for not only a totally perfect viewing angle in all directions, but also superior brightness. You then add the Gorilla Glass cover that significantly increases the durability of the display. On top of it all is Motion's View Anywhere, which is an anti-reflective sputtered coating on the front side of the glass that is optically bonded to the display.

    How does it work? Extremely well. Between the super-wide viewing angle (which makes for an unbelievably "stable" display) and the excellent sunlight viewability, this is a machine that you can really use outdoors. The Gorilla Glass adds peace of mind (no, I didn't try to break it). And the Gorilla Glass also has another benefit that may turn out to be quite a selling point for Motion: it's nearly immune to smudges. There's nothing worse than a display that's full of grime and fingerprints, and that just doesn't seem to be an issue with Gorilla Glass.

    So there. It's a funny name, Gorilla Glass, but it's definitely a good thing. And I am not surprised that Motion is the first to have it on a tablet. They always seem to adopt new stuff first.

    Posted by conradb212 at 2:47 AM

    September 10, 2009

    Gotcha, fool! Your friends at AT&T

    The other day we tested a rugged handheld in the RuggedPCReview.com lab. The device so happened to have a SIM slot because it also worked as a phone and a WWAN data communicator. I so happen to have an unused phone with a SIM in it, and so I decided to use that SIM for testing the rugged handheld. Why do I have an unused phone? Because it's on one of AT&T's 2-year service contracts. It's just a crappy throw-away phone, but thanks to AT&T I am now paying for it for another year whether I am using it or not.

    So I stick that SIM into the review handheld, make three local calls and load a couple of pages of the RuggedPCReview.com website. Works fine. I take the card out and return it to the unused AT&T phone.

    So then I get the bill. That'll be $14.83 for 1,483KB, i.e. loading one or two large webpages. Thank you very much, AT&T. This kind of highway robbery is precisely why I have completely stopped making any call that I am not certain is covered in my "plan." I am not even calling my mom anymore because I have no clue what outrageous amount AT&T may charge me for a call to Europe.
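    The bill works out to a suspiciously tidy rate. A quick back-of-the-envelope check, with the rate inferred from the numbers on that bill:

```python
def pay_per_use_charge(kb, cents_per_kb=1.0):
    """Casual-use data charge in dollars. $14.83 for 1,483 KB
    implies a pay-per-use rate of 1 cent per kilobyte."""
    return kb * cents_per_kb / 100.0

charge = pay_per_use_charge(1483)
# charge == 14.83 -- and at that rate, a single 5 MB page
# load (5,120 KB) would run about $51.
```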

    But wait, there's more.

    I was on vacation in the Caribbean for a week. I took my iPhone with me, not because I was going to make a call (heavens no, not with AT&T in an unknown situation!!!), but because the iPhone is a little computer/camera/vidcam/PDA that I take everywhere. Well, apparently six people called my phone while it was in the Caribbean. I never answered. "That'll be a buck 99 for each call, fool. Haha. Gotcha again. - Your friends at AT&T."

    And yet AT&T and the other telcos wonder why we loathe them so much.

    With voice/data increasingly integrated into rugged handhelds and notebooks, be very careful. That SIM in your machine has "Sucker!!!" written all over it.

    Posted by conradb212 at 9:30 PM

    July 30, 2009

    Deal killers: The Telco 2-year contracts

    Years ago, when some exciting new piece of technology came along I simply could not resist buying it. When the first Newton came out I plunked down seven hundred bucks, just to see how it worked and because I simply had to have one. Likewise when Compaq released the Concerto pen tablet in the early 1990s. And when that same Compaq came out with its first iPAQs, I bought one.

    You can't do that anymore these days. That's because virtually every piece of technology now includes a phone, and in order to get service you have to sign up for a 2-year contract with the telephone company. Not gonna happen. If I could pick and choose service or just try out a service, I'd probably have a Palm Pre by now, and each of my notebooks and tablets would probably have a wireless card in it. As is, I'd have to sign up for 2-year contracts for each of those devices. Not gonna happen, ever.

    So instead of having a Palm Pre and being able to tell friends and anyone out there interested in reading my blogs and articles on what I think about it, I couldn't care less. Am I going to sign with Sprint just to get a Palm Pre? Not gonna happen. Sprint is the company that sent me to collections three times after I cancelled a fully paid and expired contract. Am I going to sign with Verizon or anyone else for TWO YEARS just to get wireless in my notebook? Not gonna happen. Ever.

    I know, enough people sign those obnoxious contracts because they see no other option. But those of us who love technology and always had the latest and greatest to write about and take wherever we went don't do that anymore. We can't. The telcos' greed has killed it all.

    Posted by conradb212 at 10:53 PM

    July 13, 2009

    The dangers of product photography

    While most of the press either uses official product photography supplied by PR agencies or press centers, or takes quickie snapshots with their smartphones, we here at RuggedPCReview.com do it the hard way. We do our own product photography and always make sure that the devices are shown in the environment they are most likely going to be used in. That isn't always easy.

    I was reminded of that as we recently needed to do product photography on a good half dozen rugged machines. These were rugged and ultra-rugged computers designed to be used on forklifts, in trucks, on bulldozers and other such heavy duty equipment. Well, it so happened that there was a significant construction site nearby where a large number of utility company trucks, dozers, graders and lifts were prepping a parcel of land for who-knows-what. Construction hadn't really started yet, and so the property wasn't fenced in, and all that heavy-duty machinery was just a perfect prop for the product photography I wanted.

    So I filled the back of my car with rugged computers, seven in all, and headed for the construction site. For a couple of hours, Carol, our intrepid product photographer, posed the machines on bulldozers, trucks and all sorts of heavy equipment, taking a couple hundred great shots. But we were also sweating bullets as all of a sudden it occurred to us that law enforcement might show up and inquire as to what, exactly, we were doing and where, exactly, all those computers were coming from. The rugged tablets, panels and notebooks we photographed looked like they belonged in the trucks we took pictures of much more than they looked like they belonged to us.

    As it turned out, while a few police vehicles drove by, no one stopped and asked what we were doing. And so we didn't have to explain why we were carrying about US$25,000 worth of rugged computers from a construction site into the back of our car. Obviously, we could have explained, but it might have taken an hour or two and perhaps a trip downtown in the back of a police cruiser.

    Posted by conradb212 at 9:14 PM

    June 30, 2009

    Where rugged computers come from

    Where do rugged computers come from? Not always where you think. In an increasingly global marketplace the old business model of companies designing, making, selling and servicing their products is increasingly going by the wayside. These days, it's more likely that one company thinks of a product, hires another to design it, has it built by a third, a fourth one markets and sells it, and a fifth one does the service. As a result, it's becoming pretty difficult to figure out who does what, and where the computers we buy and use are actually coming from.

    For us here at RuggedPCReview.com, this global marketplace often means a good deal of detective work when trying to figure out who actually makes a machine. You could argue that a computer is a computer and it's not really important who designed and manufactured it. That may be so for some, but I really like to know who did the design, who specified the features, and where manufacturing took place. It'd be silly to praise a company for their excellent design when, in fact, all they did was strike a deal with a Chinese manufacturer and put their label on the machine. There's nothing wrong with that, and many companies do a great job searching for good products that they then sell and service in the US. But it'd still be good to know the actual origin and background of a machine.

    What are some of the different business models?

    • There are resellers that sell machines from other companies.

    • There are distributors which carry machines from a variety of sources and often put their own names on the machines.

    • There are vendors and system integrators that sell value-added third party machines under their own name. They may or may not have exclusive arrangements with their suppliers.

    • There are companies that have their own engineering resources and jointly develop machines with Taiwanese or Chinese manufacturers.

    • There are companies that design their own machines, but have them built by a Taiwanese or Chinese contract manufacturer.

    • And finally, there are those who still design and manufacture their own machines.

    However, it doesn't end there. Some of the Asian manufacturers have their own relationships and interconnections. As a result, we've seen machines where the top part came from one Asian company and the bottom part from another. We've seen machines seemingly made by Taiwanese manufacturers also being marketed by Chinese companies, apparently under reseller agreements (by and large we assume that machines are made in countries with lower manufacturing costs and marketed or re-sold in countries with higher costs). It can get really confusing.

    There are also an awful lot of vendors out there, some of which we had never even heard of. This morning, for example, I came across the Chinese Evoc Group, which has been around since 1993 and makes a large variety of rugged, embedded and industrial computers and components, including some interesting looking panel PCs and rugged notebooks (check the Evoc JNB-1404 and Evoc JNB-1502 rugged notebooks).

    Does it even matter where all those computers come from? Probably not to consumers. Whether the Dell or HP notebook at OfficeMax is actually made by Quanta or by Wistron hardly matters (though it really concerns me that apart from CPUs, some other chips and software, almost nothing is made in the US anymore). All those Taiwanese OEMs are top notch, and an increasing number of the Chinese ones as well. It does matter to us, though.

    Knowing, and reporting on, all those lesser known Asian OEMs means finding the hidden gems, the companies whose products we'd love to see on the US market. Covering them may lead to OEM deals with US and European companies, and such relationships can be win-win arrangements for all involved. Our feedback may also help them adjust their products for the US and other Western markets that often have different values, priorities and expectations. In that sense, I hope that we at RuggedPCReview.com can be a clearinghouse and conduit of information.

    Posted by conradb212 at 7:28 PM

    June 12, 2009

    Palm and Windows Mobile and how the iPhone really changed everything

    With all the hoopla over the much anticipated release of the Palm Pre in early June of 2009, I thought about the ever-changing fortunes of the mobile platforms in our industry.

    Disregarding some smaller players and initiatives, here's the big picture: In 1993, the Apple Newton made news when then Apple CEO John Sculley pushed it hard and predicted that such devices and their infrastructure would one day be a trillion dollar industry. Sculley was scorned for that remark, as was the Newton for its various shortcomings. But the Newton, way ahead of its time, was still good enough to get Microsoft to respond with its own mobile platform, just as Microsoft had responded a few years earlier when GO Corporation's PenPoint operating system threatened to compete with Windows in pen computing.

    So Windows CE was introduced in 1996, together with a lineup of little clamshell handhelds. The same year, Palm Computing released the little Palm Pilot that no one thought was going to be successful because it neither had a keyboard (considered mandatory after the Newton handwriting recognition fiasco) nor an expansion slot. But much to everyone's surprise, the Palm Pilot took off while Windows CE devices quickly garnered a reputation for being clumsy and underpowered.

    Microsoft's approach was to reluctantly add features and gradually allow more powerful hardware, always concerned that devices might eat into the much more lucrative low-end notebook market, just as they are now worried about netbooks. Microsoft's hardware partners played along and came up with some amazingly innovative devices (yes, you could get a Windows CE-based "netbook" with a 10-inch display and 800 x 600 resolution ten years ago), but even that didn't work against Palm, which sold handhelds by the millions and adeptly crafted a "Palm economy" and thriving developer community that quickly dwarfed Microsoft's tentative and fragmented efforts.

    At some point, Microsoft had the chutzpah to steal from Palm by trying to launch a handheld platform called the "Palm PC," but Palm's lawyers quickly nixed that name, and the ho-hum platform went nowhere. In a last-ditch attempt, Microsoft nuked its multiple processor architecture approach around the turn of the millennium and tried again with the "Pocket PC," a markedly improved platform that has survived, in almost unchanged form, to this day.

    Palm, in the meantime, thrived and reached a 75% global marketshare. When I gave a keynote presentation at the Taipei International Convention Center in 2001 on the future of pen computing and PDAs, I noted that Palm's OS was aging and Windows CE was gaining market share and might catch Palm within four or five years, but no one really believed that. Yet, it happened in a remarkable, unlikely succession of events that saw Palm fumble its leading position away and sink into virtual irrelevance while Microsoft, hardly more adept with its own mobile efforts, repositioned Windows CE as, essentially, an embedded platform for the vertical market.

    That approach, while it made sense, wasn't actually one that I thought was automatically going to be successful. In the late 1990s, Symbol Technologies, now part of Motorola, had been one of the first to adopt non-proprietary operating systems in its products. At some point, they offered both a Palm OS product and a very similar one powered by Windows CE, and at the time we were told that the Palm device did far better. Yet, Symbol was one of the very few vertical market companies that chose Palm, whereas Microsoft was remarkably successful in quietly positioning Windows CE as sort of a low-cost subset of Windows that would leverage corporate IT expertise and investments.

    So while a lot of people wondered why Microsoft couldn't do any better in the mobile space, it was probably because they didn't want to. In 2002 I reviewed the T-Mobile Pocket PC Phone, an early smartphone that was amazingly good and would still fit right into the smartphone landscape of today, both in terms of looks and performance. Yet, not much happened after that. HP pretty much gambled away the "iPAQ" brand that came into its possession when it took over Compaq. Taiwanese and Korean companies became the new driving force, with the likes of HTC and Samsung setting trends and directions. And somehow the notion took hold that every handheld had to be a phone, which, in the US at least, meant being forced into overpriced 2-year contracts with telcos that couldn't care less about anything other than profit.

    The reason why Windows CE became so successful is not that it's so good. It's a nice workmanlike effort, to be sure, but it's clumsy, sluggish and about as agile as a riverboat. It only took over because a) the proprietary computing platforms of earlier handhelds were no longer acceptable, b) Palm let it win by self-destructing, and c) IT uses Windows, and Windows CE sort of fits in. So there. It works, but it's ugly, really ugly.

    It took Apple with the iPhone to demonstrate just how ugly Windows CE was. Unlike the Newton, the iPhone got it right from the start, and it totally redefined how a mobile device should work. Its effortless elegance is exactly what people want, and Apple made it look natural and easy. The iPhone is human interface engineering at its very best. It may not meet all the IT-mandated checkmarks (yet) and thus earned stern finger-wagging from some corporate types, but even they probably have an iPhone in their pockets. Once you know how simply and beautifully things can work, you never want to go back.

    In a sense it's déjà vu all over again. Apple has a better product and a better idea, but Microsoft still dominates the desktop. Palm, back from the pretty-much-dead, tries again with a slick little box, just like the Palm Pilot once was, only this time they're copying Apple. The question in my mind is how much longer workers and industrial users will be willing to put up with klutzy, clumsy Windows CE now that almost everyone knows how well handheld electronics can work.

    Posted by conradb212 at 4:27 PM

    April 10, 2009

    Atom platform expands, but does it have a clear direction?

    In the days of the 386, 486 and even early Pentium processors, it used to be fairly easy to follow Intel's chips as they mainly differed in clock speed. These days, staying on top of Intel's various offerings has become an almost full-time job. That even goes for Intel's low-end Atom chips, which, while resurrecting some older Intel technologies such as HyperThreading, at first seemed to simplify the matter of processor selection. It didn't really turn out that way. Intel has been very successful in positioning the Atom processor as new, exciting, efficient and just generally the way to go, but it's really not that simple.

    For example, "Atom" has from the start referred to two very different processor families.

    The initial generation of Atom processors was the Z5X0, codenamed "Silverthorne," with a tiny 13 x 14 mm package footprint. They were targeted at mobile internet devices (MIDs) and used the also entirely new "Poulsbo" System Controller Hub. The processor has about 47 million transistors, more than the Pentium 4 had. Bus frequency is 400 or 533MHz, and most versions support Intel's HyperThreading. Thermal Design Power ranges from 0.85 watts for a low-end 800MHz version without HyperThreading to 2.65 watts for a 1.86GHz version with HyperThreading. The chipset uses about 2.3 watts, which means total CPU and chipset consumption isn't even 5 watts. And the chipset has hardware support for H.264 and other HD decoding. However, as the combo is targeted at internet devices, there is PATA but no SATA support.

    A second family of Atom processors, the N2X0, codenamed "Diamondville," was meant for standard low-cost PCs and netbook-type devices. The N2X0 is similar in many ways to the Z5XX platform, but uses a somewhat larger 22 x 22 mm package. The N270 has a TDP of 2.5 watts and costs less than US$44; the same-speed N230 has a TDP of 4 watts and costs US$29. As of now, the N2X0 processor generally uses a version of the older i945 chipset. In order to reduce its power consumption down to 5.5 watts, its frequency (and performance) have been lowered as well, and that chipset is called the i945GSE. This is used with the N270. The N230 chip, geared towards desktops, uses the i945GC, which is quicker but also uses 18 watts! Note that the i945's GMA 950 IGP is not able to decode HD signals. The N2X0 can also be used with SiS chipsets, though I haven't seen any such systems.

    From the looks of it, system designers have been struggling to figure out whether to use the Z5xx or the N2xx chip. In netbooks it was a slam dunk for Diamondville, as almost all netbooks use the 1.6GHz N270. However, there are exceptions. When Panasonic introduced its Toughbook CF-H1 Mobile Clinical Assistant, it came with the 1.86GHz Atom Z540 processor. And when Samwell, one of Taiwan's major OEMs in the semi-rugged and rugged space, introduced what is essentially a rugged tablet version of a netbook, they also picked a "Silverthorne" processor, in this case the Z530P.

    I am not sure what drives the decision to go with an Atom N270 versus an Atom Z530. On the surface, they seem to have about the same performance and use about the same amount of power. One glaring difference in their specifications is that the N2XX series supports the ever-important SATA (serial ATA) disk interface whereas the Z5XX does not and needs to use PATA drives. On the other hand, the technically inclined point out that the N2XX's use of a very slow version of the already dated i945 chipset makes for sluggish graphics performance and that the i945's GMA 950 IGP is not able to decode HD signals. Anyone who has tried playing back high-def video on an N270-based netbook knows the pain. However, both versions of the Atom score about the same on the two benchmark systems we use here at RuggedPCReview (PassMark 6.1 and CrystalMark 2004). The Z5xx, in fact, scored very low in 3D graphics, which one would assume is at least of some importance in any "mobile internet device."

    But things are getting more interesting yet. Despite what on the surface appears to be the more lucrative "Diamondville" market with its many millions of N270 chips, on April 8, 2009, Intel announced the expansion of the Z5xx platform with a new high-end version, the 2GHz Z550, and a new gas miser version, the "up-to-1.2GHz" Z515. At the same time, Intel spoke of an entirely new Atom platform called "Moorestown" that combines the "Lincroft" system-on-chip with the "Langwell" hub, of which as of now all I know is that it uses a lot of acronyms and is still based on the 45nm manufacturing technology.

    On the N2xx horizon, there is the N280 processor, and apparently also a dual core Atom chip. There is not much material out there on those, and I need to look more into it.

    There was another development. For embedded computing Intel quietly expanded the Z5XX platform with larger form factor versions that carry a "P" in their name, and then special "large form factor with industrial temperature options" versions marked with a "PT." I was aware that Intel would release a "large package" version of the Atom, but not the timing and the purpose. Well, this happened in March of 2009 when Intel added the "large form factor" Atom 1.1GHz Z510P and 1.6GHz Z530P as well as the "large form factor with industrial temperature option" 1.1GHz Z510PT and 1.33GHz Z520PT. What does that mean? In essence, the P and PT versions look like larger chips. Instead of the tiny 13x14mm package of the original Z5xx chips, they use a 22x22mm package, which is actually the same size as the N2xx chips. As far as temperature range goes, 0 to 70 degrees Celsius (32 to 158 degrees Fahrenheit) is considered "commercial," whereas -40 to 85 degrees Celsius (-40 to 185 degrees Fahrenheit) is considered "industrial." Interestingly, only the "PT" series processors support the industrial temperature range; the "P" series versions are listed with the same commercial temperature range as the initial chips.
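    The Celsius/Fahrenheit figures above are easy to double-check. Here's a minimal sketch in plain Python (nothing Intel-specific about it):

```python
def c_to_f(celsius):
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# "Commercial" temperature range: 0 to 70 degrees C
print(c_to_f(0), c_to_f(70))    # 32.0 158.0
# "Industrial" temperature range: -40 to 85 degrees C
print(c_to_f(-40), c_to_f(85))  # -40.0 185.0
```

    Note that -40 is the point where the two scales cross, which is why the industrial lower bound reads -40 in both Celsius and Fahrenheit.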

    Intel's updated Z5xx product brief now stresses fairly strongly that there are industrial as well as commercial temperature range packages for both the Z5xx processors as well as for their complementing US15W system controller hubs (GMA 500 graphics, I/O controller and memory controller). The brief also stresses that the small footprint versions are for space-constrained handheld and embedded devices whereas the large form factor is pitched for designs without tight space restrictions but with industrial temperature requirements. So why then do the "P" processors still have the same commercial temperature rating? Probably because the large package also includes "an integrated heat spreader" that "further contributes to its value for thermally constrained, fanless applications." Since the thermal design power of these chips was already tiny, I am not sure what the integrated heat spreader does, or why it was necessary.

    In terms of performance, the "P" large form factor and "PT" large form factor/industrial temperature range chips appear unchanged, though the TDP is up a bit from 2.0 to 2.2 watts. However, if you compare Intel's summary sheets for the Z530 and the Z530P, it looks like the Z530P chip is missing Intel Virtualization Technology as well as Demand Based Switching. Virtualization technology, according to Intel, allows "consolidating multiple environments into a single server or PC," which I believe means the CPU acts as if it were multiple CPUs operating independently so you can run different operating systems at the same time. Demand Based Switching is described as an enhanced version of Intel's SpeedStep technology (see description) that is available in both versions of the Z530. These are generally fairly involved server-oriented features, and I am not sure what the relevance to the new "large package" Atom processors is.

    In any case, the "large package" also has a different "ball pitch," which refers to the spacing of the little balls of solder that replace pins on the underside of these tiny processor packages. From what I can tell, the 0.6mm ball pitch of the original Z5xx series requires high density interconnects (HDI) on the printed circuit boards, and those are more difficult to do and also more finicky--not what one would want in a rugged product (for an example of these issues, read this). So the "P" series would address that issue with its larger package size whereas the "PT" series would appeal to automotive and other transportation and industrial applications that often have a -40 to 185 degrees Fahrenheit requirement.

    Now add to this that Atom chips, despite all the hoopla and market acceptance, are pretty poor performers, benchmarking no better than the lowly original Core Solos. Graphics performance, especially, is weak (what's considered weak in one device can be more than adequate in another, of course). There's the low power consumption, of course, but even that is not a given. We've benchmarked exceedingly thrifty Core 2 Duo machines as well as power-guzzling Atom systems, so proper setup and configuration are an issue.

    Sometimes it almost seems like the Atom is sort of a trial balloon, one where Intel very successfully created an attractive image of a hip processor, but is also somewhat aimlessly trying out various applications to see where the Atom will fit and stick.

    Posted by conradb212 at 2:20 PM

    January 15, 2009

    The Intel Atom processor phenomenon

    Frustrated with the small display and insufficient battery life of your mobile or handheld computer? Is it also too big and just not quick enough? And you can't stand a fan coming on and the thing getting so hot you can barely touch it? Welcome to the world of mobile computing where optimizing mutually exclusive goals is the order of the day. As a result, manufacturers of mobile gear are fighting a never-ending struggle to find the best compromise -- and it is always a compromise -- between size, weight, usability, performance and battery life. The screen should be large enough to be useful. Size and weight should be such as to render the device as mobile as possible. Performance should at least be adequate. And the battery must last long enough to get the job done. Long battery life either means a big battery or a device that doesn't use much power, and the latter is often preferable. Displays use a lot of power, especially with the backlight up high, but you simply need to see what you're doing and so display size may be a given.

    Which gets us to the processor. There was a time when processors cost next to nothing and the mere thought of needing to cool them with a big fan would have been preposterous. When I bought my first IBM PC in 1981, it cost US$4,000, in 1981 dollars. It was powered by a 4.77MHz Intel 8088 processor that you could buy at any electronics store for about six dollars (the folks who proclaim that ALL electronics components have become so much cheaper obviously weren't around in 1981). Intel managed to parlay the processor business into a near monopoly, with Microsoft and Intel going lock-step in a mutually advantageous game of creating ever more resource-intensive software. Microsoft made Windows bigger and bigger, and Intel delivered the processors needed to run it. That's what got us to a point where software needs minutes to boot, and the processor, chipset and graphics card all need big fans for cooling. Oh, and while the cost of computers has come way down, the cost of Intel processors has gone way, way up. A big new one can cost a thousand dollars, and even more modest ones approach the cost of low-end notebooks. A halfway decent Core 2 Duo costs more than a little Acer Aspire One netbook.

    How can Acer, and everyone else who makes small, inexpensive computers do it? Increasingly by using the Intel Atom processor, which is smaller, uses less power, and costs relatively little. Why did Intel do it? Because they found themselves in a predicament. Microsoft increasingly insists that every computer must run Windows proper. The 1990s experiment with Pocket PCs is essentially over. By insisting that small platforms had to be compatible with Windows, yet making sure they didn't get powerful enough to be a threat to the Windows business, Microsoft successfully kept the wings of mobile devices clipped, to the extent where they eventually disappeared as viable platforms. Just the other day I came across a press release from a major manufacturer of rugged handheld computers that said its customers increasingly demanded full Windows even on handheld devices. And that gets us right back to the Atom processor.

    Now cost isn't as much of a factor in the vertical marketplace as it is in the consumer market. I am not saying cost doesn't matter, but a market where a device may cost US$4,000 has a bit more leeway than one where customers expect US$800 pricing. What does matter, though, is size, weight and battery life. So what Intel did with the Atom processors is essentially remove the processor as a major power consumption factor. What do I mean by that? Well, an average Core 2 Duo desktop processor uses around 65 watts, a mobile version about 35 watts. There are chips that use considerably more or a bit less, but those are the rough numbers.

    Now how do we know how much power a processor uses? After all, Intel sells them using a weird nomenclature that, unlike light bulbs that have a watt rating, seems unrelated to performance. Instead, Intel usually provides what they call the "Thermal Design Power," or TDP. TDP is described as "The maximum amount of heat which a thermal solution must be able to dissipate from the processor so that the processor will operate under normal operating conditions." There's a good deal of debate as to what TDP actually means and how it relates to real-world power consumption of a processor. But for the sake of the argument, let's assume we're talking watt-hours and the processor is in a battery-powered computer. We can easily compute the battery's watt-hours by multiplying its voltage and amp-hour ratings. A powerful notebook computer battery may provide 75 watt-hours, just enough to run a typical desktop processor for an hour (and that's without the power needed for the display and everything else in the notebook). A frugal notebook processor with a TDP of 25 watts would run three hours, and that sounds about right (in the real world, the processor uses power conservation modes most of the time, but you have to add in the power used by all the other computer components).
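    That back-of-the-envelope math can be sketched in a few lines of Python. The 10.8V / 6.9Ah pack below is a made-up example chosen simply to land near the 75 watt-hour figure:

```python
def battery_watt_hours(volts, amp_hours):
    # Watt-hours = voltage x amp-hour capacity
    return volts * amp_hours

def rough_runtime_hours(watt_hours, tdp_watts):
    # Crude upper bound: assumes the processor draws its full TDP
    # continuously and ignores the display, chipset and everything else.
    return watt_hours / tdp_watts

pack = battery_watt_hours(10.8, 6.9)   # about 75 Wh
print(rough_runtime_hours(75, 65))     # desktop-class chip: barely over an hour
print(rough_runtime_hours(75, 25))     # frugal mobile chip: 3.0 hours
```

    In practice the processor idles at far below TDP most of the time, so this is only a worst-case bound, not a battery life prediction.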

    Now what does an Atom processor use? Between 0.6 and 4 watts. There are two different families of Atom chips, one geared towards mobile Internet devices (MIDs) and one towards netbooks and other low-cost PCs. The most popular chip in mobile computing is probably the 1.6GHz Atom N270, which has a TDP of 2.5 watts. That's the chip you find in almost all current (early 2009) netbooks and in many embedded components. Why two families? Because MIDs and PCs have different feature requirements. MIDs are usually multimedia-oriented, and power consumption is totally crucial because the devices are so small. Netbooks and similar devices generally rely more on compatibility and standard PC interfaces (like SATA).

    So where do the Atom processors fit in as far as power consumption goes? Well, 2.5 watts is sensationally low compared to just about anything else available. The generally unloved Intel Core Solo uses about 5.5 watts in its ultra-low power version (U1300/1400/1500), the Core 2 Solo (U2100/2200) about the same, the mobile Core 2 Duos between 10 watts (U7500) and 45 watts (Q9100/9300). So the most popular Atom processor uses less than half the power of a Core Solo and only a small fraction of that of the Core 2 Duo chips.

    Now keep in mind that processors need corresponding chipsets, and those use power, too. Intel designed a super-efficient chipset to go with the MID-oriented Z5xx series of Atom chips that was once codenamed Silverthorne. That chipset, the "Poulsbo System Controller Hub," can do high definition video decoding and other neat stuff required in consumer multimedia devices, and it only uses about 2.3 watts. However, it does not support serial ATA and some other essentials, which rules it out for many computing applications. The N2x0 series of Atom chips uses the i945GSE, which is a slowed-down version of an older Intel chipset, the i945. That's good as far as compatibility goes, but there is no high-def decoding and 3D performance is low. The i945GSE uses about 5.5 watts, so overall consumption of the N270 and the chipset is still only about 8 watts, but it's not exactly a state-of-the-art solution.

    How about performance? This is where it gets a bit complicated because overall "performance" of a computer depends not only on the CPU, but also the chipset, the memory, the hard disk or SSD, overall system configuration and -- very important -- the OS platform and software loaded. That said, we run fairly extensive benchmarks on all systems that come to our lab, and so far we've found that an average Atom N270 device scores roughly one third of that of a 2.5GHz Core 2 Duo T9400, about 30% less than that of a 1.2GHz Core Duo U2500, about the same as a 1.2GHz Intel Core Solo U1400, and about 50% better than that of a 1GHz Celeron M 373. So we're talking decent, but certainly not blazing speed.
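    To make those relationships easier to see, here is a small sketch with the N270 normalized to 1.0. The values are derived purely from the ratios quoted above, not from actual benchmark scores:

```python
# Relative overall benchmark performance, Atom N270 = 1.0.
# Illustrative numbers only, computed from the ratios in the text.
relative = {
    "Core 2 Duo T9400 (2.5GHz)": 3.0,        # Atom scores about a third
    "Core Duo U2500 (1.2GHz)":   1.0 / 0.7,  # Atom about 30% lower
    "Core Solo U1400 (1.2GHz)":  1.0,        # about the same as the Atom
    "Atom N270 (1.6GHz)":        1.0,
    "Celeron M 373 (1GHz)":      1.0 / 1.5,  # Atom about 50% higher
}

# Print fastest to slowest
for cpu, score in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{cpu:28} {score:.2f}x the N270")
```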

    As far as architecture goes, the Atom is an interesting mix of old and new technologies. It's definitely state-of-the-art in terms of miniaturization, using Intel's hafnium-based high-k manufacturing process, fancy terminology for the use of different conductor materials to make even tinier transistors possible. The architecture of the chips is less advanced. There's only a single core, though Intel uses the old HyperThreading technology known from as far back as the Pentium 4. There are also advanced new power-saving technologies.

    Overall, the Atom is certainly an interesting marketing phenomenon. At this point, everyone is clamoring to get onboard the Atom bandwagon, and somehow Intel managed to stay clear of the nuclear power connotation one would expect from a name like "Atom." Intel, though, stresses the hafnium-based manufacturing process, and hafnium's primary use is in control rods in nuclear power plants, so that may be the "Atom" connection. In any case, even with the sub-optimal chipset situation, the lack of some features, and only moderate performance, Atom is hot. And in the new Intel world order of massively expensive processors, Atom is cheap, too, with prices well under US$100 depending on the type and version. I've seen $44 mentioned for the N270, and about the same for some of the low-end Z5x0 chips plus their Poulsbo chipset. Oh, and if you wonder what the difference is between the N270 and the 230: they run at the same speed, but the N270 is for mobile applications whereas the 230 uses a bit more power (4 watts) and is paired with a considerably more power-hungry version of the i945 chipset, making the Atom 230 more suitable for desktop use.

    As usual, there are numerous expert opinions out there, and the overall consensus seems to be that, for now, the Atoms just represent Intel's first step into the small form factor embedded and MID market that is pretty much dominated by ARM-based designs.

    With Intel's resources and marketing savvy, Atom as a "low power" processor platform may well be here to stay. As is, they are off to an amazingly good start.

    For much more information on the Silverthorne platform, check Intel's Intel Atom processor Z5xx Series.

    Posted by conradb212 at 4:42 PM

    January 5, 2009

    The amazing success of "netbooks"

    These days, "netbooks" get a lot of press. You'd think a "netbook" were some sort of miraculous new device, a technological breakthrough that lets you do new and wondrous things. In fact, "netbooks" are nothing more than little notebooks. There is absolutely nothing new or exciting about them. And there is nothing that makes them earn the "netbook" name.

    Nor are they new. There have been numerous attempts at selling downsized miniature laptops over the years, going back to the early 1990s and before. None were ever successful. People simply did not want an underpowered mini version of a notebook with a small screen and a keyboard that was not full size. Apparently that's changed and "netbooks" sell by the millions. Go figure.

    One difference perhaps is that technology has come a long way. Even an underpowered mini notebook can do just about anything anyone would ever need in terms of computing. Standard word processing, scheduling, spreadsheets, presentations, email and internet access tasks can all be done on a mini notebook. Let's take a look at what "netbooks" offer:

    For the most part they are clamshells measuring about 10 x 6.5 inches and weighing between two and three pounds. They have displays measuring between 7 and 10 inches diagonally, usually with WSVGA resolution, which means 1024 x 600 pixels. Their keyboards are usually around 90%-scale, which is infuriating because it makes touch-typing a pain, and also because there'd actually be enough room for a full QWERTY layout by making punctuation keys smaller, but apparently Taiwanese and Chinese ODMs and OEMs do not realize that. Memory is usually limited to a gigabyte, though some can be expanded to a gig and a half. Storage is either via Flash for Linux-based netbooks or a generously-sized hard disk for Windows-based units. Most come with a rudimentary onboard cam, SD card or multi-card slots and, of course, Bluetooth and WiFi. And most are powered by Atom chips, generally the 1.6GHz N270.


    How do they work? It depends on your expectations. Benchmark performance is about a third of that of a modern notebook, so routine stuff can take much longer than you're used to. The biggest limitation is the small screen. My Acer Aspire One, one of the most popular netbooks, has an 8.9-inch screen that is bright and sharp, but 1024 x 600 pixels simply isn't enough for anything these days. Working with it becomes a continuous battle for screen real estate, which means turning off unneeded toolbars and a lot of scrolling, scrolling, scrolling. The term "netbook" is also a total misnomer, as the one thing where the current generation of netbooks falls way behind is fast web access. Pages take forever to load.

    If they are such a pain to use, why do I have a netbook? Because they have a lot going for them, too. My Acer One runs Windows XP speedily on 1.5GB of RAM, and the 160GB hard disk is both quick and large enough. With its 6-cell battery the little Acer can run as long as six hours on a charge, and sometimes more. I like its dual SD card slots. I occasionally miss an optical drive, but have my office network set up so I can access the DVD drive of a desktop. Most of all, I like the Acer's small and handy size. Packing and transporting even a compact notebook is usually a pain, but the little Acer netbook fits absolutely anywhere. Even its power supply is tiny. In my office, I hook it up to a 20-inch LCD and a full-size keyboard and mouse. I get full 1600 x 1200 pixel resolution, which makes working on the little Acer feel like working on a "real" computer.

    So, "netbooks" they are not. But there does seem to be a good-size niche for surprisingly competent little notebooks that go for less than US$400. Price is definitely an issue. I'd rather have a more rugged device with a touch screen. Fujitsu and Panasonic and others make them, but for several times the money. Why not a rugged netbook with a very small price? It might sell in large quantities.

    Posted by conradb212 at 5:29 PM

    December 19, 2008

    The problem with Linux

    On the surface, Linux should be a huge winner, and in many respects it is. Hey, what more can one want than a free operating system with mostly free software that runs on just about anything? I've been using Linux for many years for just that reason. Free. No hassles with activation, copy protection, and other pesky schemes meant to keep pirates away yet only inconveniencing customers.

    So why hasn't Linux taken over? Because it's too complex. Sure, there are distributions that install simply and easily, but you can also spend hours trying to get one little thing to work right. Linux is a giant patchwork of code from all over the world. Perhaps the biggest challenge is that almost all Linux developers think Linux is so simple that absolutely everyone should be able to perform arcane steps and procedures.

    Linux suffers from the expert syndrome. The expert syndrome is what makes academics speak in nearly incomprehensible language. It makes them look and sound important and, in their minds, is a reflection of their superior intellect and knowledge. Coders, likewise, revel in acting as if their most complex systems were child's play and anyone who does not master it must be an idiot. Some of the instructions for Linux are so complex and incomplete as to make it impossible for anyone who does not already know the systems to install things or make them work.

    In all my time of working with Linux I've found perhaps a handful of truly useful tutorials and instructions. Sadly, this pits an incredibly productive global community of Linux coders and developers squarely at odds with the rest of humanity who can no more compile a kernel than split an atom. The rest of humanity also does not appreciate being talked down to when it comes to doing simple things like properly extracting a file, making a wireless connection work, or numerous other things that should be simple and self-explanatory but, in Linux, are not.

    Unfortunately, I do not see a solution to this problem. You either have tightly controlled empires like Microsoft or Apple where things are centrally controlled and packaged, or you have loosely knit global communities of techies with all their human brilliance and flaws. So things will likely continue the way they have for decades, with Linux being both a terrific solution and one that can be endlessly frustrating.

    Posted by conradb212 at 4:32 PM

    November 21, 2008

    Smartphone & Pocket PC Magazine -- the shortsightedness of letting an incredible resource die

    With Microsoft sitting on billions of dollars in cash and spending many millions on comedian Jerry Seinfeld and a silly Vista campaign, the one magazine that has covered Pocket PCs and Windows Mobile for many years has just died due to lack of support. I am talking about Smartphone & Pocket PC Magazine, published by Thaddeus Computing Inc. Those guys were publishing magazines on small Microsoft-powered computers for almost a quarter of a century, yet neither Microsoft nor Hewlett Packard apparently cares enough about real, quality coverage of their products to at least use this incredible magazine as a venue for advertising, let alone as the important, invaluable partner in spreading the word about handheld computers that they are (and now were).

    Having founded and run several print magazines myself, I know all about the work and hardship that goes into creating a quality magazine, and how things are all different in this day and age of the Internet and web. Advertising dollars are increasingly moving away from print, and people no longer want to wait for information to appear in print. Everything is available instantly. Yet, the information on the web is .... different. In a way it almost does not compete with print. How else would one explain the fact that there appear to be more magazines on newsstands than ever? I myself absolutely cannot imagine life without computers and the web, yet I have a good dozen print magazine subscriptions that I never intend to give up. Magazines and the web are as different as radio and TV -- both convey information and entertain, but in different ways. Unfortunately, tech companies like Microsoft do not seem to understand that, and the phone companies who have taken over the smartphone business are clueless about the market that has fallen into their laps.

    Fact is, online is becoming much like TV -- far too many channels and nothing to watch. It's all commercials and infomercials. You have to channel-flip not because you can, but because you're constantly avoiding commercials and seeking something, anything, meaningful to watch. And quality is getting lost in a vast sea of drivel. You can google a particular product and instantly get 10,000 references to it, mostly junk. By now the web is jam-packed with virtually content-free sites that are just landing pages for ads and more ads. Even reputable sites are doing it: two paragraphs of content and then commercial bombardment. The ever more popular "customer reviews" are often little more than "this product sucks!", "no, this product is the best ever" slugfests, and the same goes for bulletin boards where there is endless posting and almost no factual information. Despite the by now almost suffocating commercialization, it's all worth it, of course. But it is NOT a replacement for a good print magazine.

    When I look at the final copy of Smartphone & Pocket PC Magazine (the Smartphone & Pocket PC Super Resource Guide Dec/Jan 2009) I see a hundred pages of superb, comprehensive information, a reference guide I am certain to keep around for years. You'd have to visit literally thousands of websites to get that amount of good information, and even then you would not get the quality. A complete and total spec list of ALL smartphones with touch screens? Check. A complete and total spec list of ALL PDAs? Check. Reviews and ratings of hundreds of the best software apps? Check. A complete analysis of GPS on Windows Mobile, including product reviews and comprehensive comparison charts? Check. Detailed reviews of the leading and upcoming smartphone platforms? Check. And that is just a small part of it. If a consultant were given the task of compiling the huge wealth of information contained in just one issue of Smartphone & Pocket PC Magazine, it'd cost many tens of thousands of dollars, and probably hundreds of thousands. For a company like Microsoft to let such an incredible resource die -- a resource that does nothing but promote Microsoft's mobile embedded platform -- is simply unimaginable. Spending millions on nonsensical commercials and sitting on billions, yet not supporting real, quality, serious information: it just does not compute. The cost of supporting a resource like Smartphone & Pocket PC Magazine that provides real information is absolutely minuscule compared to the billion-here, billion-there mentality of big business.

    Lacking any meaningful support from the Windows Mobile side of things, Thaddeus Computing is now going on to cover the iPhone platform with their new iPhone Life magazine. It'll be an uphill battle as now they'll be dealing with one single hardware and software vendor (Apple), one single service provider (AT&T), and application software vendors who do all of their selling through Apple's App Store, so the impact of print advertising will be less traceable than ever. The iPhone is hugely popular, of course, but neither will people buy another iPhone (they're locked into a 2-year contract) nor can they buy another model (there's only one). The phone companies have historically not supported enthusiast magazines and there is no indication they ever will. They also don't "get it," something at least the Microsoft field people certainly did.

    But won't Apple be thrilled to see one of the most respected niche and enthusiast publishers switch allegiance? Likely not, if they even notice. Apple is sitting on its own billions of cash, but I am fairly certain none of it will go to supporting a small magazine that could spread high quality news and real information at an annual cost that's a tiny fraction of the interest on Apple's cash reserves alone. And AT&T, which in the U.S. has a service monopoly on the iPhone? Hah.

    So best of luck to the folks at Thaddeus Computing. It's an absolute crying shame to see Smartphone & Pocket PC Magazine die, and those in the Windows Mobile industry who let that happen deserve to be accused of colossal, inexcusable shortsightedness. Maybe someone will come to their senses and buy Thaddeus. 25 years of experience and commanding knowledge of the major serious mobile platform in the world AND they know how to compile and present information AND they have all the magazine distribution channels in place AND running them for a year probably costs peanuts? No brainer if you ask me.

    Posted by conradb212 at 3:27 PM

    November 18, 2008

    Thoughts about ingress protection: eliminate potential points of failure

    The most commonly used measure for protection against the elements is the IP rating, or Ingress Protection rating. The IP rating consists of two digits: the first indicates protection against solids, the second protection against liquids. Solids ratings go from 0 to 6, with 6 meaning the best protection (dust-tight). Liquids ratings go from 0 to 8, with 8 meaning the highest protection. Essentially, the purpose of these ratings is to determine how well a device can keep out dust and water. As far as liquids go, the purpose of the rating is not to signify waterproofing for underwater operation (though IP68 means a device is indeed waterproof) but how well a piece of equipment can keep out water during normal operation in the field. A device may get exposed to rain, for example, or even strong driving rain during a storm. In a marine setting it is possible for a device to suddenly become exposed to heavy seas, and it may need to be protected against that.
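
    For illustration, the two digits can be decoded mechanically. The sketch below paraphrases the IEC 60529 protection levels in shorthand; the wording is mine, not the standard's exact text:

```python
# Shorthand summaries of the IEC 60529 protection levels; the wording
# here is paraphrased for illustration, not the standard's exact text.
SOLIDS = {
    0: "no protection",
    1: "objects larger than 50 mm",
    2: "objects larger than 12.5 mm",
    3: "objects larger than 2.5 mm",
    4: "objects larger than 1 mm",
    5: "dust-protected (limited ingress permitted)",
    6: "dust-tight",
}
LIQUIDS = {
    0: "no protection",
    1: "vertically dripping water",
    2: "dripping water when tilted up to 15 degrees",
    3: "spraying water",
    4: "splashing water",
    5: "low-pressure water jets",
    6: "powerful water jets and heavy seas",
    7: "temporary immersion up to 1 meter",
    8: "continuous immersion beyond 1 meter",
}

def decode_ip(rating: str) -> str:
    """Turn a code like 'IP67' into a human-readable summary."""
    solid, liquid = int(rating[2]), int(rating[3])
    return f"{rating}: solids, {SOLIDS[solid]}; liquids, {LIQUIDS[liquid]}"

print(decode_ip("IP67"))
# -> IP67: solids, dust-tight; liquids, temporary immersion up to 1 meter
```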

    All of this needs to be tested and certified, and the way it is usually done is by following standard procedures that describe a controlled lab testing setup, like document 60529 issued by the International Electrotechnical Commission (IEC).

    The problem is that lab tests do not always accurately predict what may happen in real life. In that respect the ratings should really be considered guidelines rather than hard data. Consider, for example, two devices that both carry an IP67 rating. One of them has no external ports other than a single surface mount connector used to provide interfacing via a port replicator or dock. The other has a variety of commonly used ports, all protected by individual rubber plugs. That machine may also have an externally accessible expansion slot and an easily replaceable battery, each nicely sealed via o-rings and other high quality seals. Which device do you think is more at risk for leaking?

    I'd say the second as it has multiple areas of entry as opposed to just one. No matter how well engineered the device may be, the probability of something going wrong is higher. A protective cover may not be pushed in all the way. A seal may have shrunk or gotten broken. A door was inadvertently left open. It can happen.

    A compromised seal may not necessarily mean a leak into the inside of the device. The port itself may carry enough sealing in addition to the protection provided by its cover to ward off damage. Then again, it may not. The bottom line is that the simplest and most foolproof protection is best.

    Anything mission-critical should be fail-safe. Fail-safe means that if a system fails, it must fail in its safe state. A relay that snaps closed when it loses power is an example. The problem with protective rubber and other seals is that none are fail-safe. They are all fail-fail. So the best way to proceed is to have as few potential points of failure as possible.

    What that means is that, all else being equal, a device with fewer possible points of failure will almost always be a better choice as far as protection is concerned.
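
    The intuition behind that can be put in rough numbers. If we assume, purely for illustration, that each sealed opening fails independently with the same small probability, the chance that at least one gives way grows quickly with the number of openings:

```python
def p_any_failure(p_single: float, n_openings: int) -> float:
    """Probability that at least one of n independent seals fails,
    given each fails with probability p_single (illustrative model)."""
    return 1 - (1 - p_single) ** n_openings

# Hypothetical 1% failure chance per seal per deployment:
print(p_any_failure(0.01, 1))  # one docking connector: 0.01
print(p_any_failure(0.01, 6))  # ports, slot, battery door: about 0.059
```

    Six independently sealed openings carry almost six times the risk of a single connector, before any difference in seal quality is even considered.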

    Posted by conradb212 at 11:12 PM

    November 10, 2008

    Benchmarking popular mobile Intel processors

    Well, we finally managed to benchmark a mobile device with an Atom processor. Like everyone else, I was wondering where Atom performance fits in. The Thermal Design Power (TDP) of the 45nm Atom processors is so ridiculously low that it's impossible to even make an educated guess. There are, of course, a number of different Atom processors out there, but one that appears to be popular in small mobile devices is the Atom N270.

    The N270 is a single-core processor that runs at 1.6GHz and has a TDP of 2.5 watts -- significantly less than even an ultra-low voltage Intel Core Solo and only a small fraction of the power consumption of your average consumer notebook. There are other system parts that use power, and for now Intel doesn't offer Atom-compatible chipsets that are nearly as miserly as the processor itself. Further, a lot of the advanced features we've come to take for granted in Intel Core processors are simply not part of the Atom. Instead, Intel resorted to the hyper-threading technology from its past. It's all quite complex and it probably takes a chip design expert to tell how various Intel technologies impact performance.

    What we can do is run benchmarks, and that's what we did on an Atom N270-powered Acer Aspire One netbook, an exceedingly handy little clamshell computer with a WXGA 8.9-inch display and a weight of just over two pounds. The tiny Acer came with a gigabyte of RAM, a 160GB 5400rpm disk, and ran Windows XP. Our standard benchmark suite, PassMark, did not complete and so we switched to CrystalMark 2004R2. Here are the results:

    PERFORMANCE COMPARISON       Intel A110   Core Solo U1400   Atom N270   Core Duo U2500
    Clock speed                  800MHz       1.2GHz            1.6GHz      1.2GHz
    Test Unit                    GETAC E100   Motion F5         Acer One    Xplore 104C4
    Thermal Design Power (TDP)   3.0 watts    5.5 watts         2.5 watts   10.0 watts
    ALU                          3026         4565              5544        9291
    FPU                          3682         5343              5370        11124
    MEM                          2732         4989              4442        6132
    HDD                          3614         3252              7900        6381
    GDI                          3040         4239              3293        3987
    D2D                          2530         4221              2912        3899
    OGL                          738          1151              684         1187
    Overall CrystalMark          19362        27760             30145       42001

    These figures suggest that systems equipped with the Atom N270 are quite a bit quicker than machines with the Atom's predecessor chip, the A110, but only a bit faster than the first-gen Intel Core Solo. The 1.6GHz Atom N270 is no match for the 1.2GHz Core Duo U2500 that's used in a number of high-performance Tablet PC slates. The high clock speed of the single core N270 is therefore a bit misleading. Clock cycle for clock cycle, the unloved Core Solo is more powerful.
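
    The clock-for-clock claim is easy to verify from the table. Using the overall CrystalMark scores and the clock/TDP figures above (the per-MHz and per-watt ratios are my own derived metrics, not CrystalMark output):

```python
# (clock in MHz, TDP in watts, overall CrystalMark score), from the table
chips = {
    "Intel A110":      (800,  3.0,  19362),
    "Core Solo U1400": (1200, 5.5,  27760),
    "Atom N270":       (1600, 2.5,  30145),
    "Core Duo U2500":  (1200, 10.0, 42001),
}

for name, (mhz, tdp, score) in chips.items():
    print(f"{name:16}  {score / mhz:5.1f} points/MHz  {score / tdp:7.0f} points/watt")
```

    The Core Solo delivers about 23 points per MHz to the Atom's 19, confirming the clock-for-clock comparison, but the Atom returns nearly twice as many points per TDP watt as its closest rival.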

    However, in a lean, smartly designed system with enough RAM and a speedy disk, such as the Acer One netbook, the N270 can deliver both power and economy. The Acer feels fairly quick, and it runs about 2-1/2 to three hours on a small 24 watt-hour 3-cell battery and 5-1/2 to six hours on a 49 watt-hour 6-cell battery.
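
    Those runtimes imply an average system draw in the 8 to 9 watt range, which is easy to back out from the pack capacities (the runtime midpoints are my rough estimates from the figures above):

```python
def avg_draw_watts(capacity_wh: float, runtime_h: float) -> float:
    """Average power draw implied by battery capacity and runtime."""
    return capacity_wh / runtime_h

print(f"3-cell: {avg_draw_watts(24, 2.75):.1f} W")  # midpoint of 2.5-3 hours
print(f"6-cell: {avg_draw_watts(49, 5.75):.1f} W")  # midpoint of 5.5-6 hours
```

    The two packs imply nearly identical draw, which suggests the quoted runtimes are consistent with each other.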

    Posted by conradb212 at 9:59 PM

    October 15, 2008

    Ultra-rugged waterproof displays

    In RuggedPCReview we usually cover mobile computers, i.e. systems that combine processing, storage, data input and display all in one unit. That, however, doesn't mean that all mobile systems must be all-in-one devices. Tablets and slates, for example, are often used in conjunction with an external display and full-size keyboard in stationary settings, and there really is no compelling need for vehicle and panel mount systems to be all-in-one.

    I was reminded of that when I came across some very interesting display products from a company called Digital Systems Engineering, located in Scottsdale, Arizona. They have the DVE Raptor display where DVE stands for "Driver Vision Enhancement." It is a ruggedized LCD display designed to operate under the kind of extreme environmental conditions encountered in tactical wheeled and tracked vehicles. The 10.4-inch SVGA display is sunlight readable with a super-strong 1,000-nit backlight (standard notebooks have less than 200 nits), good vertical and horizontal viewing angles, and zero color shift.

    What's most amazing, though, is the Raptor display's environmental specs. It carries an IP67 rating, which means it is not only totally sealed against dust, but it is also waterproof to the extent where it is submersible. Hopefully that won't happen in a tactical vehicle, but this display will continue to operate under water. It can also operate in an extremely wide temperature range of -40 to 158 degrees Fahrenheit, handle any degree of humidity, and operate at 45,000 feet of altitude. Needless to say, the milled aluminum and heavily sealed and protected display has been shock and vibration tested to MIL-STD-810F specs.

    The screen, which only weighs a bit over eight pounds, is also MIL-STD-3009 compliant. MIL-STD-3009 (also referenced as DOD-STD-3009) sets requirements for aircraft display equipment for use with night vision imaging systems. For mobile computers that generally means they must not interfere with night vision equipment in a cockpit. Part of this document is the U.S. Navy MIL-HDBK-87213 Revision A (Electronically/Optically Generated Airborne Displays) that describes, among other things, criteria for the legibility of electro-optical display equipment and daylight readability in bright environments, which is a military requirement. This can be an issue with daylight-readable displays marketed to the government and armed forces.

    If the indestructible Raptor is overkill, Digital Systems Engineering has a line of MSM monitors, where MSM stands for Mil Spec Monitor. These come in various display sizes (8.4 to 15 inches) and are lighter than the Raptor: despite IP67 sealing, they weigh between just 3.5 pounds (8.4-inch display) and 6.9 pounds (15-inch display). Yet the MSMs are MIL-STD-3009, MIL-L-85762A and MIL-PRF-22885 compliant and have an incredibly bright 1,400-nit backlight in addition to anti-reflective and anti-glare surface treatments, making them viewable under any lighting conditions.

    To learn more about those super-rugged monitors, check Digital Systems Engineering's website at http://www.digitalsys.com.

    Posted by conradb212 at 4:18 PM

    September 30, 2008

    Why is no one using Marvell's speedy and powerful PXA320?

    When we reviewed the TDS/Trimble Nomad last year here at RuggedPCReview.com, I marveled over the machine and noted, "The 800 MHz Marvell PXA320 processor certainly had something to do with it. The difference between it and the 624MHz PXA270 is much larger than we expected."

    In fact, the chip performed so well in the Nomad that I was certain other manufacturers would quickly follow suit and use the formidable PXA320 chip as well. Interestingly, that didn't happen. If I remember correctly, the only other product I've come across that uses the PXA320 is the Aceeca Meazura MEZ2000, which I think is still in the planning stage. Everyone else still seems to be using the older PXA27x, even in new designs. The PXA27x is certainly a good and time-proven processor, but it is no match for the PXA320 when it comes to performance.

    Maybe something is going on that I am not aware of. Maybe Marvell isn't pushing the chip and it's such a secret that no one realizes technology has advanced. Maybe it's too expensive, or has some drawbacks I am not aware of. As is, the Nomad with its powerhouse PXA320 chip appears to continue to enjoy a significant performance edge over anyone else out there.

    Posted by conradb212 at 12:36 AM

    September 26, 2008

    The digitizer mystery

    Imagine if someone had patented hard disks so iron-clad that no one else could make them. Or that an enterprising company had legally locked up LCDs such that it had a monopoly. If that were the case, we might still have giant, sluggish 20 megabyte (not gigabyte!) hard disks and computing as we know it would not be possible. And we'd all get eye strain from using smallish, barely readable antediluvian STN displays. That would be a bad situation. As is, fierce competition propels progress, and as a result we have the most wondrous products brought upon by innovation and improvement.

    Except in one area.

    Digitizers.

    How much progress has there been since I began reviewing pen computers back in 1993? Basically none. And as far as I can tell, that sad situation sits squarely in Wacom's court. Wacom's patented digitizer technologies have resulted in Wacom having almost 96% market share in Japan, and a good 70% in the rest of the world. The Wacom digitizers I used on 1993 pen computers worked, sort of, but were hugely frustrating because it was essentially impossible to calibrate them. The Wacom digitizers I have used in vastly better and more powerful computers in 2008 worked, sort of, but were hugely frustrating because it's essentially impossible to calibrate them. I mean, there are any number of touch screens where you can calibrate 25 points or more, do edge compensation, and all sorts of other cool stuff geared towards enhancing precision and improving the user experience. A Wacom digitizer calibration? Four points, and that's it. Along the edge of the screen, the digitizer is often so badly off that it becomes frustrating to use.
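
    For what it's worth, multi-point calibration is mathematically straightforward. Here is a minimal least-squares sketch for a single axis (the sample readings are made up, and a real driver would fit both axes plus skew, not just gain and offset):

```python
def fit_axis(raw, true):
    """Least-squares gain and offset mapping raw readings -> true positions."""
    n = len(raw)
    mean_r, mean_t = sum(raw) / n, sum(true) / n
    gain = (sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, true))
            / sum((r - mean_r) ** 2 for r in raw))
    offset = mean_t - gain * mean_r
    return gain, offset

# Hypothetical raw x-axis readings against known on-screen targets:
raw_x  = [102, 260, 518, 776, 930]
true_x = [100, 256, 512, 768, 924]
gain, offset = fit_axis(raw_x, true_x)
print(f"x' = {gain:.4f} * x + {offset:.2f}")
```

    With 25 or more sample points the same math averages out local distortion near the edges, which is exactly what a four-point calibration cannot do.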

    I've complained about this for pretty much as long as I can remember, and there hasn't been any change. Anything else in computing has improved dramatically. What gives? Is Wacom's technology inherently incapable of working better? Is no one else able to come up with a better alternative because of patent blocks? I don't know, but between Microsoft's marginal handling of the Tablet PC and the dismal performance of the Wacom digitizer, pen computing is where it is.

    There. End of sermon. I just had to say it.

    Posted by conradb212 at 1:53 AM

    September 2, 2008

    MIL-STD-810F 509.4 and thoughts on salt water exposure

    During a week of scuba diving off Roatan island in Honduras, I had first-hand experience of what salt water exposure can do to equipment. I took several underwater cameras with me for testing and used them on up to four daily dives to 85+ feet with each lasting an hour or more. I thoroughly rinsed off the equipment after each dive, but still found that salt accumulated under rubber coatings, inside screw holes, under screw heads and inside or under anything that allows water to go under or moisture to seep in. After I returned back home I soaked all equipment again in my bathtub and then cleaned each part and component. Without that, adjustment screws, hinges and joints could seize, and the equipment quickly deteriorate due to longer term corrosion.

    I remember when Panasonic showed me the results of their Toughbook corrosion testing on an invitational tour of their facilities in Osaka back in 2002. Without special consideration of salt water and salt fog exposure, there could quickly be appalling damage as shown on the picture to the right (click on it for a larger version). Panasonic explained how they had been approached with requests for such testing, performed the salt water and salt fog tests, and were surprised to see the extent of the damage. They then systematically changed design and materials to ward off or minimize the effects of salt. This benefitted all subsequent Toughbooks, and also showed Panasonic how to develop special solutions for customers who use their products in environments where they are exposed to salt fog and water.

    When you look at these pictures it becomes obvious that sealing alone is not enough when it comes to salt water exposure. Sealing standards only tell how well a product keeps dust and water out of the inside of the unit. They don't tell what salt can do to components that lie outside of the sealing barriers. What can salt do when it gets under a keyboard? Inside a hinge? Underneath protective doors? The result can be ugly. Nothing can ever ward off salt entirely when a product is used in marine environments. Users need to keep computers away from excess exposure as much as possible, and equipment needs to be cleaned meticulously after any exposure. That means that cleaning must be possible in the first place, which means that places that are potentially exposed to salt water and fog must be accessible. There are just a whole bunch of additional considerations.

    This is why the famous MIL-STD-810F (Department of Defense Test Method Standard for Environmental Engineering Considerations and Laboratory Tests) document includes a 9-page section on Salt Fog testing.

    MIL-STD-810F Method 509.4 describes testing methods to determine the effectiveness of protective coatings and finishes on materials for corrosion, electrical effect and physical effects. The tests can also determine the effects of salt deposits on the physical and electrical aspects of materiel. The product is exposed to salt fog mist from a 5% salt solution via atomizers at about 95 degrees Fahrenheit for a minimum of four alternating 24-hour periods, two wet and two dry. The product is then examined for salt deposits that can clog or bind components, electrical malfunction, and potential short and long-term impact of any observed corrosion.
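
    The test parameters translate into simple numbers (the solution recipe follows the 5% figure above; exact tolerances are spelled out in Method 509.4 itself):

```python
# A 5% salt solution by weight: 5 parts NaCl per 95 parts water.
water_g = 950  # grams of water
salt_g = water_g * 5 / 95
print(f"{salt_g:.0f} g NaCl per {water_g} g water")

# Minimum exposure: four alternating 24-hour periods, two wet and two dry.
periods = ["wet", "dry", "wet", "dry"]
total_hours = 24 * len(periods)
print(f"minimum test duration: {total_hours} hours")
```

    So the minimum test is four full days of alternating salt fog and drying, which is why salt deposits, not just leaks, are part of the post-test inspection.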

    The reason why I am writing this all down is because my return coincided with an announcement from GETAC that its impressive B300 rugged notebook had received Salt Fog certification. Here's part of their press release:

    LAKE FOREST, CA. – September 2, 2008 – GETAC Inc., a leading innovator and manufacturer of rugged computers that meet the demands of field-based applications, announced today that its B300 ruggedized notebook PC received full Salt Fog certification based on testing standards set by the Department of Defense (MIL-STD-810F – 509.4). Salt Fog is a specialized test used to evaluate and determine the effectiveness of protective coatings and finishes on materials to repel salt corrosion and may also be applied to determine the effects of salt deposits on the physical and electrical aspects of materials. Adding the Salt Fog certification to an already robust and rugged notebook PC makes the GETAC B300 the ideal choice for military installations, marine applications such as the Coastguard and other industries where salt or salt air can impact equipment performance.

    “Salt is one of the most aggressive chemical compounds in the world,” said Jim Rimay, president, GETAC. “Salt will quickly corrode a computer’s exterior, impair vital electrical system functions through salt deposits and have a physical impact by restricting free movement of its mechanical components. The B300 addresses these issues with its Salt Fog certification and elevates it to an elite status among ruggedized computers for safe and uninterrupted operation in any location, especially in coastal regions of the world.”

    We recently did a detailed hands-on test of the Getac B300 and found it to be a very impressive machine full of clever engineering and innovation. A combination of optical coatings and superbright backlight make the screen readable in the brightest sunlight, and amazing power conservation methods can extend battery life to a stunning 12 hours. It's good to see that the company also invests in testing against one of the less-often mentioned environmental threats to mobile computers -- salt fog exposure. While most specs include resistance to drops and vibration, salt fog/water exposure can destroy a piece of equipment just as surely. Once the corrosion is detected, it's usually too late, so it's nice to see Getac take proactive steps.

    MIL-STD-810F, however, only describes testing methods, and not the criteria that determine passing tests. It would therefore be nice to know what Getac found during its tests, and what the company did to make the B300 as immune to salt fog damage as possible.

    Posted by conradb212 at 2:31 PM

    August 6, 2008

    The Motion Computing F5

    We've had the Motion Computing F5 tablet here in the lab for a while. The F5 is a follow-up to Motion's C5 medical market tablet, which was a rather unique design solution that received a lot of positive feedback. The folks at Motion are generally right on the mark, and have been ever since some former Dell people formed the company back in 2002 or so to take on Fujitsu with a Tablet PC slate. At the time no one gave them much of a chance to prevail in a market that Fujitsu practically owned with their Stylistic pen tablets, but Motion pulled it off. I remember a dinner meeting with Motion founders Scott Eckert and David Altounian in San Francisco where they showed me the prototype of their initial tablet. It wasn't substantially different or better than what Fujitsu had at the time, but it was immediately obvious that the Motion folks truly believed in their product and that they had a very clear focus. That never changed. Whereas tablets are just a small part of their overall business for Fujitsu, tablets are the only thing Motion does. It's been six years now, and Motion never wavered from their mission. And somehow they always managed to stay ahead of the curve, with new technologies generally available in Motion products sooner than anywhere else.

    I don't know what the thought process was that led to the design of the original C5 medical tablet, but it was certainly a smart decision to go after the medical market. It's a tough one to break into for a variety of reasons, but also one where mobile systems can make a huge impact. At Kaiser, the HMO I use, they finally have terminals in almost every examination room so they can call up patient info, and they can now also call up x-rays onscreen, but it took them forever, and I still see no portable electronics. I suppose it's the same elsewhere.

    The Motion C5 was an attempt to provide a portable computer that could do more and was easier to integrate into the daily workflow of medical people. So they made it small and light and gave it an integrated handle to easily carry it around. They integrated an RFID reader and a bar code reader and also a camera. They also made it white so it fits in with all the other medical equipment, and it's easy to wash and disinfect. Motion also created a small, handy dock for it. So the overall idea was to provide a small computer that was easy to carry around and that included all sorts of data capture methods. It all still depended on systems integrators to package the hardware with medical systems software, and then have hospitals actually pick it up and use it. I am not sure how many did, but the Motion C5 was, and currently still is, probably the best mobile hardware for such projects.

    When I first looked at the C5 I wondered why Motion limited the platform to just one market. True, it's a potentially huge market, but the C5 seemed sturdy enough to be used in other mobile applications, and it already carried IP54 sealing, which means it didn't mind a bit of rain and some spills. Motion apparently agreed and created a second version of the C5, the F5. They called this one a "Field Tool" -- not the greatest of names, but obviously an attempt at communicating that this computer should be seen as a tool for jobs rather than a conventional computer.

    I must admit, I had a bit of a hard time with the F5. When I wrote about the C5, I had no problem seeing the design decisions that had been made to make this computer just right for the medical market. The size, the shape, the features, the color and so on. The F5 is gray instead of white, but other than that, it's the same computer. It does include Motion's "View Anywhere" display because unlike the C5, the F5 would probably be used outdoors where sunlight viewability counts. Beyond the display, though, there doesn't seem to have been any additional thought on how to make a computer best suited for use in the field.

    The way I see it, the field IS different from a hospital. You won't always have a dock to charge a computer, and so the fairly small battery of the C5 may not be enough. And in the field it does come in handy to have a USB port or two and perhaps even an old serial port for some arcane instrument or measuring tool you need to hook up. And having some sort of expansion slot also comes in handy. Wireless communication is great and we can't do without, but it's been my experience that even with Bluetooth and WiFi, there are times when it's a lot simpler to just copy files onto a USB key or an SD card than to send them. The F5 can't do that as it doesn't have any ports or slots and totally relies on wireless or the dock.

    All of this made it a bit more difficult to review the product. I am used to Motion having a very clear rationale for a machine, and in this case the rationale seemed to be that the healthcare C5 was good enough to be offered for other markets. That was probably a good idea, but something still doesn't feel quite right. Even the "View Anywhere" display that I remember as effective from previous reviews of Motion tablets seemed rather low-contrast compared to other sunlight-viewable technologies on the market.

    The F5 is also one of the few machines that uses the Intel Core Solo processor. The Solo is essentially a Core Duo with one core not used, sort of like an 8-cylinder engine with only four of them running to conserve fuel. It is an economical chip, with a thermal design power of just 5.5 watts, which is only a bit more than half of what a Core Duo chip running at the same clock speed uses. Problem is that benchmark performance is much lower, too, and generally closer to the lowly Intel A110 than even an ultra-low-power Core Duo. The F5 is no slug at all, at least with Windows XP, but with Motion always being at the forefront of technology I wonder why they didn't just use an Atom processor instead. They did switch from the Core Solo U1400 to a Core 2 Solo U2200 which is said to include better caching and even more power-saving technologies, so perhaps that was the right move for now.

    Anyway, just a few thoughts on what is, in fact, an interesting and welcome addition to the hardware alternatives available to those who need to implement computing solutions in the field. The official review of the Motion Computing F5, with pics and specs and all is here.

    Posted by conradb212 at 2:03 PM

    June 23, 2008

    Tablet PC: We could use a hammer....

    "We could use a hammer..." That's the tag line of MobileDemand's latest video in their Tablet PC Torture Chamber Series where a man uses a Tablet PC to hammer a bunch of large nails into a board. The video is the latest in a series of increasingly sophisticated and outrageous demonstrations of just how tough their Tablet PC is.

    Usually, rugged equipment is dropped or exposed to water to show that it can survive the kind of punishment encountered in the field. MobileDemand's earlier videos pretty much followed that tradition. xTablets were dropped, exposed to showers, rolled down a hill and so on. But soon the videos showed drops more extreme than anything that would likely happen in the real world. And instead of being exposed to a showerhead, the computer was strapped to the top of a car and run through a car wash five times, with the computer running and its display on camera during the whole ordeal.

    And now the "We could use a hammer..." video. It's very smart. No one would actually use a computer as a hammer (though, come to think of it, I've used a variety of objects as hammers when none was handy), but the image of using that sophisticated piece of electronic equipment as a hammer certainly drives the point home, no pun intended.

    Using the xTablet Tablet PC computer as a hammer really serves to illustrate a point: shock and vibration do happen in the field. If you use a machine in a truck or as a data capture device you do not intend to damage it, but sooner or later it will fall. And constant vibration affects the computer. Eventually things can happen. Electrical parts may touch and short-circuit. Fasteners and ties may come loose. Structural pieces may crack. Seals may deform and begin leaking. Electrical contacts may become unreliable. The display panel may get out of alignment. Wiring may chafe. Materials may fatigue and then break. And so on. At best, sealing may be compromised, electrical noise may be introduced, and individual parts are headed for failure. At worst, the computer fails.

    This is why manufacturers typically provide test data, usually how a product performed when tested using the procedures described in MIL-STD-810F. Those procedures try to replicate conditions actually encountered in the field during transportation and operation. That makes sense, but the testing is quite involved and not very easy to interpret. Witness the following caution regarding vibration testing found in MIL-STD-810F 514.5:

    Care must be taken to examine field measured response probability density information for non-Gaussian behavior. In particular, determine the relationship between the measured field response data and the laboratory replicated data relative to three sigma peak height limiting that may be introduced in the laboratory test.

    That's a mouthful, and the results are even more difficult to read. A general integrity test may then yield results such as, say, a power spectral density of 0.04G²/Hz, 20 to 1000Hz, descending 6dB/oct to 2000Hz. MobileDemand, like all the other serious rugged equipment vendors and manufacturers, has its gear tested in accordance with MIL-STD-810F (and other) procedures, but what has more impact: tech specs comprehensible only to engineers, or a video of a man using that rugged Tablet PC as a hammer while it keeps on working?
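
    Those numbers aren't quite as opaque as they look, though. A random-vibration profile like that integrates to a single overall G-rms level; the sketch below computes it for the example profile quoted above (the dB-per-octave conversion follows the usual power-spectral-density convention):

```python
import math

def segment_area(f1, f2, psd1, slope_db_oct):
    """Area (G^2) under a PSD segment from f1 to f2 Hz, starting at
    psd1 G^2/Hz and sloping at slope_db_oct dB per octave."""
    # For PSD in G^2/Hz, a slope of s dB/octave means PSD(f) = psd1*(f/f1)^m
    # with m = s / (10*log10(2)).
    m = slope_db_oct / (10 * math.log10(2))
    if abs(m + 1) < 1e-9:  # the 1/f special case integrates to a logarithm
        return psd1 * f1 * math.log(f2 / f1)
    return psd1 * f1 / (m + 1) * ((f2 / f1) ** (m + 1) - 1)

# The example profile: 0.04 G^2/Hz flat from 20 to 1000 Hz,
# then descending 6 dB/octave to 2000 Hz.
flat = segment_area(20, 1000, 0.04, 0)
tail = segment_area(1000, 2000, 0.04, -6)
grms = math.sqrt(flat + tail)
print(f"overall level: {grms:.1f} Grms")
```

    The profile works out to roughly 7.7 Grms, a single number that is far easier to compare across spec sheets than the raw PSD breakpoints.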

    "We could use a hammer..."

    Brilliant.

    To see the "We could use a hammer..." video, click this Blip.tv link.

    Posted by conradb212 at 3:46 PM

    May 28, 2008

    Electrovaya settles patent infringement suit

    An interesting situation: An intellectual property company named Typhoon Touch Technologies announced Electrovaya had settled a patent infringement lawsuit by Typhoon and Nova Mobility Systems "for an undisclosed sum representing a royalty payment of at least 20% on past and future sales of its Scribbler Tablet PCs in the United States. Additionally, Electrovaya formally recognized the validity of Typhoon’s patents at issue in the litigation and acknowledged infringement of one or more of the patent claims." (see here)

    20% on past and future sales of a tablet? Wow! And recognizing the validity of a patent? That's even more amazing given the vague and confusing nature of many patents. So what is this patent for? That would be US patents 5,379,057, issued January 3, 1995 and 5,675,362, issued October 7, 1997. They both have the same abstract:

    "A portable, self-contained general purpose keyboardless computer utilizes a touch screen display for data entry purposes. An application generator allows the user to develop data entry applications by combining the features of sequential libraries, consequential libraries, help libraries, syntax libraries, and pictogram libraries into an integrated data entry application. A run-time executor allows the processor to execute the data entry application."

    The drawings accompanying both patents show a tablet computer like the ones Momenta, IBM, NCR, GRiD, Samsung, Fujitsu, Dauphin, TelePad, Toshiba and many others offered for sale in the early 1990s. The picture on the right shows the drawing included in the 1995 patent and a couple of computers that precede it. The two computers I added for comparison's sake are a 1993 IBM ThinkPad 700/710 and a 1992 Dauphin DTR1. On the surface it's hard to see how a 1995 patent for a "self-contained general purpose keyboardless computer" could impact a 2008 Electrovaya slate when numerous companies made such computers already in the early 1990s. Then again, patents are finicky things and their interpretation is up to courts.

    Anyway, the patents in question were issued to Microslate, a company that was certainly a pen computing pioneer with its ultra-rugged Datellite touch screen computers (see one of our early reviews of it in Pen Computing here).

    Interestingly, Typhoon also sued Dell, Xplore, Sand Dune (the Tablet Kiosk folks) and Motion for infringement on touch screen technology and seeks damages for lost profits. Motion reached some sort of settlement. Typhoon apparently thinks that the patent in their possession covers just about the entire mobile market: "manufacturing, selling, offering for sale, and/or importing a variety of portable computer products, including but not limited to tablet PCs, slate PCs, handheld PCs, personal digital assistants (PDAs), ultra mobile PCs (UMPCs), smart phones, and/or other products covered by the patents-in-suit."

    The suit has a co-plaintiff in Nova Mobility Systems, located in Tempe, Arizona. Nova, interestingly, offers the SideARM handheld. The SideARM was originally conceived by long defunct Melard and then became part of Microslate's lineup, the very company that was assigned those two patents. Typhoon's Form 10QSB shows that they bought the patents from Nova Mobility and agreed to pay them a 10% royalty from enforcements. So Microslate, an early player in the rugged slate market, sat on the patents all this time, then sold them, and now they are supposed to cover virtually every mobile device ever made even though such devices existed long before the patents? Elegant.

    We're all in favor of respecting intellectual property, but figuring out what exactly that means isn't always easy. When I was a kid many decades ago I envisioned a little black box that told me everything I wanted to know when I simply asked it a question, and that let me communicate with anyone who had one. I doodled drawings of it. Does that mean I own the exclusive rights to cellphones, smartphones, Google and the entire web? Sadly not. But it would really be nice to at least have 20% of all those sales.

    Posted by conradb212 at 4:36 PM

    May 26, 2008

    XP Embedded: When benchmarks lie

    Providing rugged mobile computers is a constant exercise in trade-offs and balancing. Screens get bigger and brighter, processors get faster, disks get larger, and customers want all that, without paying for it in the form of larger batteries and more weight. The problem, really, is that battery technology has not kept pace with the rest of the circuitry inside a computer, and so batteries struggle to provide enough juice to keep everything running for long. When you think about it, it's pretty bizarre that the very machines that are supposed to go as fast as possible often annoy their users by constantly trying to go to sleep, stand by, hibernate or shut off. Or that they come factory-configured to run at half speed and with the backlight dimmed.

    The increasing power demand of the latest electronics (and in the processor department, their cost) has driven many manufacturers to look for alternate solutions. One is to pick a much simpler processor that consumes a lot less power. That approach, however, has its own problems. Two primary ones, in fact. The first is that customers think a machine with a "slow" processor cannot possibly be very powerful. And second that, in fact, it isn't. Fortunately there's a solution, albeit one that is only suitable for certain tasks and applications.

    An embedded operating system.

    See, a general purpose OS, like Windows XP Professional, is just that, general purpose. You can do anything you want with it, and run anything you want on it. With that in mind, Microsoft equipped Windows XP with all the drivers and software and utilities one could possibly need. The result is a rather large operating system with numerous processes and services running all the time, all consuming memory and power, and having the potential to slow even a powerful machine to a crawl.

    An embedded operating system is totally different. The idea is to only use what you need to perform a certain task and leave everything else behind. This greatly reduces the size of the operating system and dramatically reduces hardware requirements. XP Embedded is generally used for smart, connected and service oriented commercial and consumer devices that do not need all of Windows XP, yet can still run thousands of existing Windows applications. An embedded OS can easily be as small as 40MB and it's even possible to cut it all down to around 8MB with a bootable kernel.

    XP Embedded is not one-size-fits-all. A company will determine exactly what a machine is for and what it should be able to do. They then include as many components (hence the term "componentized" operating system) as they need. There are over 10,000 components available and it's easy to create lean, nimble embedded OS platforms that can still do sophisticated high level tasks like advanced multimedia, browsing, communications or whatever a task requires. An embedded OS can even run as a real-time OS via third party plug-ins. Essentially you get the power of the basic Windows XP engine, but without any overhead you don't need.

    Which means that in an embedded systems machine, benchmarks do not necessarily tell the true story. They simply measure raw power, but not how efficiently that power is put to use. What all this boils down to is that a mobile computer with an embedded OS can be much faster than you'd think it is based on its hardware specs. In fact, we reviewed some that were so quick that almost no one would believe they ran on a low-power, inexpensive processor and just a minimum of RAM. So benchmarks would tell one story, real world performance another.
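    The gap between benchmark score and perceived speed can be sketched with a toy model. All numbers below are hypothetical, purely for illustration: a benchmark measures raw processor throughput, but what the user experiences is whatever share of those cycles is left over after the operating system's background load.

```python
# Illustrative sketch (hypothetical numbers): why a leaner OS can make a
# "slower" chip feel faster. A raw benchmark score measures the CPU alone;
# what the user sees is the share left after OS background load.

def effective_score(raw_score: float, os_overhead: float) -> float:
    """Portion of raw CPU performance left for the user's application.

    os_overhead is the fraction of cycles consumed by background
    services, indexing, update checks, etc. (0.0 to 1.0).
    """
    return raw_score * (1.0 - os_overhead)

# Two hypothetical systems:
full_xp = effective_score(raw_score=1000, os_overhead=0.45)  # fast chip, full desktop OS
embedded = effective_score(raw_score=700, os_overhead=0.05)  # slower chip, trimmed embedded OS

print(f"Full XP on fast chip:  {full_xp:.0f}")
print(f"Embedded on slow chip: {embedded:.0f}")
# In this toy model the "slower" chip delivers more usable performance.
```

    The overhead fractions are invented, of course, but the direction of the effect is what matters: the benchmark ranks the first machine far ahead, while the user may well find the second one more responsive.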

    This is not to say that an embedded OS is the perfect solution for all mobile computing tasks. But it can be for organizations that build their own customized, componentized OS. And for those who have very clearly defined applications that work within the confines of an embedded OS.

    Posted by conradb212 at 3:50 PM

    May 21, 2008

    What happened to Symbol!?

    Symbol Technologies was always one of my favorite companies. I visited their headquarters in Holtsville, Long Island, several times over the years and always came away impressed with their sleek designs and willingness to try out new ideas. That feistiness carried over into some aggressive acquisitions (like the bitter fight with Telxon) and, after some financial irregularities, the sale of Symbol itself. Now Symbol is part of Motorola, but it isn't very clear what kind of part.

    A good year or so after the acquisition Symbol seems to have been halfway absorbed into Motorola, but if you go by the Motorola website it's almost impossible to figure out how. Symbol is only listed as carrying bar code scanners, mobility software, and OEM scan engines, but no longer any handheld computers. The former Symbol handhelds have become sort of stateless, popping up under "Mobile Computers" without any brand name at all. So the former Symbol MC50, for example, is now just an "MC50," presumably a Motorola product of some sort.

    It's actually quite sad to see all that. Symbol's once proud state-of-the-art handhelds now languish, carrying on in some way with dated processors and even more dated software. Some have unceremoniously been discontinued whereas others seem destined to just die from neglect. The MC35, MC50, and MC70 had a very promising career ahead of them when they were introduced, but now they are aging rapidly. The emphasis appears to be on the big and fairly conventional MC9000 Series of handhelds. They come in a variety of permutations with various size keypads, and they remain reasonably up-to-date with Windows Mobile 5.0 and Marvell (why does almost everyone still call them Intel when Intel sold the business a long time ago?) PXA270 processors.

    There may well be method to this madness, and the decision to focus Symbol entirely on scanners may be a good one. Obvious it's not. And it's truly sad to see Symbol's proud legacy of handheld computers rapidly go to seed. I mean, make them part of.... SOMETHING!

    Another sad thing is Motorola's website itself. It must rank right up there with the most confusing, least user-friendly ones I've seen. It's not surprising the company is in such trouble. The impression you get along every step of the way is, "We don't know who or what we are, or what we want to be!"

    Frankly, as is, I think Symbol, and its customers, would have been a whole lot better off with Symbol intact and independent. Spin them off so they can get back to business, Motorola.

    Posted by conradb212 at 1:21 AM

    May 6, 2008

    A video says more than a thousand pictures

    While it's still not entirely clear how the YouTube phenomenon is changing our view of the world, changed it has. Initially we thought YouTube and its many competitors were simply repositories for stuff people recorded off TV, but that has changed. These days, if anything happens anywhere, whether it's important or not, it'll be on YouTube in a moment.

    However, the YouTube phenomenon has also led to entirely more serious changes in how things are being portrayed to the world. Specifically, video is being used to show what products can do. But that's not new, you might say. No, the idea of using video to highlight a product is not new, but the way video is being used now is. In the olden days, videos were mostly polished commercials, the kind we watch on TV (unless we have TiVo). YouTube gave video sort of an underground flavor. It's not glitzy footage created by Madison Avenue types, but clips done by us, the people.

    Last fall, for example, we thought it might be fun to do an underwater video of one of the products we reviewed. It was by no means professional quality; we just used a little Casio digital camera with a YouTube mode. Then we set up a tripod in a pool, I donned my scuba gear and, bingo, video of a handheld computer being used underwater. This went up on YouTube with a rather innocuous title, "Trimble Nomad computer goes diving." Amazingly, even with this non-provocative title and very utilitarian keywords (trimble, tds, rugged, scuba, waterproof), the video has been viewed over 4,000 times in the few months since. Another one we did a bit later, of the Juniper Systems Archer Field PC, has also been viewed almost 2,500 times. Hmmm....

    Turns out, an increasing number of entrepreneurial companies are taking advantage of the YouTube phenomenon by rolling their own underground videos. One of our sponsors, MobileDemand, has been playing a leading role by creating a number of videos that demonstrate the toughness and ruggedness of their xTablet slate computer. The result is a series of increasingly better and more outrageous videos that are both funny and compelling. While I never warmed up to Panasonic's omnipresent "Legally we can't say..." commercials/videos/billboards/print ads, MobileDemand makes their point much more convincingly (and at infinitely lower cost). And while the origins of the idea are clearly based on the YouTube syndrome, MobileDemand is running its videos on Blip.tv which has much better video quality.

    If you haven't seen one of the MobileDemand videos you can do so right here by running the clips embedded in this paragraph. You see their flagship product being tossed around, thrown off a hill, and strapped to the top of a car and taken through a car wash. In a loose adaptation of the MIL-STD-810F "drop test" (officially called MIL-STD-810F Method 516.5, Procedure IV -- Transit Drop), you see the xTablet being dropped, rapid-fire, 26 times. To drive the point home they use the computer to pound a nail into a wooden board. All the while, video is running on the computer's screen so you can see that it still works and never skips a beat. That's pretty clever. Oh, and knowing that outdoor footage of a screen that is not outdoor-viewable isn't exactly compelling, the MobileDemand folks make sure it's abundantly clear that theirs IS outdoor-viewable. It's all done in a fun, "YouTube" way. To demonstrate that their tablet's display, usually the most vulnerable part of a rugged computer, can take a direct hit, they drop a full beer can onto it. And then, to make sure folks realize that a beer can dropped from a few feet packs a punch, they drop one onto a guy's midsection. Ouch!

    A video can clearly say more than a thousand pictures. That's because we've all become jaded with mere images. We all know how easily they can be edited, modified and faked. Video, that's another story. It's hard to fake a video of a guy hammering a big nail with his computer. Which means, for now, demonstrating products on funky videos is a great idea. It certainly doesn't replace images or the printed word as video is a serial medium that you pretty much have to watch from start to end as opposed to glossing over "random access" print.

    Posted by conradb212 at 7:59 PM

    March 18, 2008

    Shrinking military spending an opportunity for mobile vendors?

    What I am about to write is based on assumptions and conjecture. It has to do with military procurement. And more specifically, military procurement of rugged mobile technology.

    We've all heard about the proverbial $600 toilet seats and other supposed gross waste of resources. We also somehow assume that the military has ultra-advanced equipment and secret weapons that are more sophisticated than anything we can think of. At the same time, having served in the military, I know that the armed services often use equipment that, by civilian and commercial standards, is completely and utterly obsolete. So what is true? That the military has incredible gee-whiz weaponry and gadgets, or is it all tried-and-true (and rather old) stuff?

    Most likely some of both. When you peruse the product lineups of some of the defense contractors you see some shockingly obsolete stuff in there. Machinery powered by ancient Pentium chips, murky LCDs, a complete lack of modern interfaces and so on. Heck, our fighter planes are positively ancient if you applied the standards of, say, the automotive industry. Sure, they are said to be equipped with the latest computer gadgetry, but still, how up-to-date can decades-old designs be?

    Anyway, I really want to talk about how all of this relates to the cost of rugged mobile equipment. In a recent summary report, Venture Development Corporation (VDC) reported that military spending on expensive rugged mobile technology may dry up in coming years. They also stated that this will leave an interesting opening for a new class of "good-enough" hardware that can fill most, or even all, requirements at a considerably lower price. What this means is that the military may stop paying premium prices for traditional military market equipment from traditional military market vendors. So instead of simply ordering a successor model from an established (and presumably expensive) vendor, they may look around for less costly alternatives.

    This indeed may present an interesting opening for some companies that have not traditionally dealt with the military market. It also means that such companies will have to take a crash course in how to deal with the military, learn more about requirements and certifications, and about service and sales cycles. Truth be told, we've seen a good number of "civilian" rugged handhelds that we believe could serve the military quite well whereas some of the traditional gear makes us wonder about its usefulness.

    So are some vendors just a small learning curve and a few modifications away from being serious contenders for armed forces contracts? Or is dealing with the governments simply too cumbersome to even attempt for anyone other than the handful of defense contractors?

    Costs, of course, are relative. Given that a very simple ankle fracture without any complications or anything cost a friend of mine the appalling amount of $28,000 five years ago, I can only imagine what the military's health care cost must be. Perhaps, compared to that, it simply doesn't matter whether a handheld costs $1,500 or $5,000.

    Posted by conradb212 at 7:40 PM

    March 10, 2008

    Keeping track of who makes (and sells) what

    Keeping RuggedPCReview.com updated is no easy task. In the olden days, when we started Pen Computing Magazine back in 1993, there were only a small handful of companies that offered ruggedized equipment. These days, even giant companies like Dell are realizing that adding durable and ruggedized equipment makes a lot of sense. I mean, in a mobile world not everyone is well-served with a flimsy, plasticky notebook that can't handle the potential abuse during a day on the job.

    Anyway, keeping track of things... Not only is it quite a job to stay on top of every tech upgrade (and with Intel adding and changing processors every few weeks those come hot and heavy), it's often even more difficult figuring out who makes what and where it's being sold. For many years now, most notebooks sold in the world have been made by a fairly small number of Taiwanese and, increasingly, Chinese OEMs. For a while we licensed Pen Computing Magazine to a publishing company in Taiwan and I had a chance to go to Taipei to see them and also make a presentation on Tablet PCs in the Taipei International Convention Center. My hosts arranged for interviews with most of the major OEMs, such as Compal, Quanta, Mitac, FIC, Tatung and so on. That was very informative, but it's difficult to keep track of the ever-changing alliances between OEMs, ODMs, resellers, partners and customers.

    So what does that mean for all the hundreds of rugged products listed and described at RuggedPCReview.com? Most are manufactured, though not necessarily designed, by an OEM in Taiwan. Many are joint productions where a computer company designs a product and then has it built by an OEM. Or the various aspects of design are divided in some way. Or a product is available from several vendors, but is customized for particular markets for different vendors. Sometimes there are exclusives. Other times the same machine is sold under different labels. There are also cases where an OEM sells a product under its own name, but that same product is also sold by other companies under different labels. This whole big supply chain means that there are many different ways of working together.

    As for us here at RuggedPCReview.com, we always try to know who exactly makes a product. That's primarily so that we can state facts. If a product is really good, we'd like to know who deserves the praise. It makes no sense to heap praise on an OEM when the design actually comes from elsewhere. Or, the other way around, celebrate the genius of a reseller when they really did not design the product at all.

    But that's not all of it. Another problem for us is that larger resellers do not necessarily offer the same machines in all markets. This morning, for example, I updated some product listings and realized that some of the old Dolch products were still listed under Kontron, the German company that had taken over Dolch in February of 2005. We had often marveled at Dolch's various rugged platforms at industry tradeshows and were a bit saddened to see them get absorbed. After all, Dolch had been building rugged machines since 1987. So we relisted whatever Kontron took over as Kontron machines and added new contact information. Kontron had also created a new website, kontronmobile.com.

    At the time, Kontron's CEO was quoted as saying, "This investment presents an excellent opportunity for Kontron to further expand its embedded computer solutions in the USA and Europe on mobile platforms for government and defense programs." Well, apparently it was not such a great opportunity after all as Kontron's US website now states, "Thank you for your interest in mobile rugged computing. This line of products was recently acquired by Azonix, a division of Crane Company." Crane, as it happens, is a multinational with over 10,000 employees. Azonix Corporation is located in Billerica, Massachusetts, and was set up in 1981 as a design and manufacturing firm specializing in rugged, high-precision measurement and control products. Some of the former Dolch/Kontron products are now part of the Azonix Military Grade Solutions product lineup, in competition with the likes of DRS Tactical and General Dynamics.

    The Dolch/Kontron/Azonix NotePAC, however, looked familiar to me, and it turned out to be a GETAC machine, the A790. On a hunch I went to the German Kontron website, and it turns out that Kontron continues to sell rugged notebooks in that, and other, markets, just not in the US. In fact, the German Kontron lineup does not hide its GETAC origins. They have a whole line of Kontron NotePACs, all carrying the same model numbers as the corresponding GETAC machines.

    Nothing wrong with all that, of course. It's just another example of how everything is going global. But after all is said and done, customers need to know who they can call if they need service and support. And then it is good to know they're dealing with a reliable, competent company that doesn't just slap a badge on a machine and push it out the door. In the end, it is that support and that local connection that matter and factor heavily into that holy grail of vertical market mobile computing, the Total Cost of Ownership.

    Posted by conradb212 at 6:48 PM

    March 3, 2008

    Where will Intel's Atom chip fit in?

    On March 3rd, 2008, Intel introduced the low-power Atom processor designed specifically for mobile internet devices. While desktop chips draw as much as 35 watts of thermal design power (TDP) and even ultra-low power Core Duos draw almost 10 watts, the Atoms will draw from 0.6 to 2.5 watts. Intel stresses that the chip is not a shrunken version of a desktop chip, but designed from the ground up. In a series of YouTube-style videos various Intel spokespeople describe Atom's use. It goes into really inexpensive ($250-400) notebooks. It is "Intel's architecture for mobile devices." It is for "devices that fit in pockets." And it is "the basis of new sexy: low power and small." And no fan is needed. Does this mean the Atom processors are meant to replace the ARM-based PXA processors that Intel jettisoned to Marvell?

    It's really confusing with processors these days. Back in the early days of mobile computing everyone knew what to expect from an 8088 processor (including price, which was about $5), and then, say, a 386/16 or a 486/33. People even had a "feel" for how fast a Pentium 90 was going to drive an early Windows computer. Later, Intel's product lines mushroomed, but it was still kind of possible to guess how each would perform because in the public's mind, the clock speed of a computer chip determined how fast it was. Then Intel did away with that also, sort of, and now we have slower processors that are faster and faster ones that are slower. Processors are no longer sold on their specifications, but on what wonderful things Intel says they will do for us.

    For those of us in the mobile field, one problem with Intel has always been that the company really had no mobile chips. Whatever found its way into notebooks was generally a crippled desktop processor. Sometimes crippled in terms of technology (like when one of two cores was simply disconnected as in the unloved Core Solo) and sometimes by running the poor thing with so little juice that it barely moved.

    But, you may say, Intel also had the PXA processors, developed specifically for handheld devices. Yes, it did, and it is not entirely clear why. Think back to the beginnings of Windows CE in the mid 1990s (it was introduced at Comdex 1996 to be exact). Windows CE began as a multi-processor architecture platform. Unlike desktop Windows PCs that almost exclusively relied on Intel, CE devices had a choice of several chip architectures. There was support for Hitachi's SuperH architecture and two variants of Silicon Graphics' MIPS engine, and then Microsoft announced support of the 486 and Pentium, the PowerPC 821, and the ARM architecture. I don't think the first three ever became real, but ARM support sure did. Anyway, the competition among chip manufacturers was heavy and resulted in sort of an "arms race" to deliver faster and more integrated chipsets. There quickly were faster versions of the Hitachi SH-3, Philips introduced the TwoChipPic set, and NEC the 4100 family. Toshiba announced its entry with the MIPS-based TX39 family of RISC processors (perhaps one of the quickest CE chips ever), and Digital Equipment Corporation the StrongARM 1100. And there was AMD with its 486-compatible Elan variants. Now that is competition.

    Sadly, all that changed with Pocket PC 2002 when Microsoft dropped support of the MIPS, SH, and X86 architectures and mandated the use of an ARM core, which at the time was the SA1110 "StrongARM," and the ARM72xT and ARM92xT. That swiftly eliminated a whole bunch of CE device manufacturers from the market, and some never came back. At least, we thought at the time, ARM processors were made by Intel, Motorola, Texas Instruments, and ARM itself, but even then we assumed that there would be an emphasis on the Intel StrongARM and Intel's Xscale architecture.

    XScale, of course, prevailed and was soon found in virtually all Windows CE devices. Now let's remember that StrongARM really wasn't an Intel invention at all. It originated with none other than the once mighty Digital Equipment Corporation, the supermini powerhouse that once seemed destined to replace IBM, but then meekly imploded and sold itself to Compaq, which meekly imploded and sold itself to HP. Somewhere along the process Intel picked up StrongARM and quickly morphed it into XScale. I remember several somewhat awkward conference calls where Intel reps tried to explain how XScale was different from StrongARM. In the end it really didn't matter as the Intel PXA chips became fairly competent workhorses for millions of Windows CE-powered devices.

    However, XScale had fatal flaws. First, it couldn't run "real" Windows. Second, it wasn't a very lucrative business. And third, it was not invented here. So off it went, to Marvell. Marvell Technology Group -- a silicon solutions high tech firm based in Santa Clara, California -- officially took over Intel's communications and applications processors in November of 2006 and has since launched the PXA3xx series, consisting of the high-end PXA320 running at 806MHz, the cost-optimized low-end PXA300, and the PXA310. The 806MHz PXA320 is a scorcher as we found out in a review of the Trimble/TDS Nomad rugged handheld. Unfortunately, Marvell's marketing is so low-key that hardly anyone knows they exist. Check the tech specs of just about any Windows CE device and it still says "Intel PXA." And despite the remarkable power of the PXA320 chip, few have picked it up. Shame, that.

    So now we have the Intel Atom chip. Designed from the ground up for mobile devices. Designed for cheap computers costing just 250-400 bucks. Not a shrunken desktop chip, but still one with 47 million transistors. One that goes into devices that fit into pockets but also on desktops, and those inexpensive notebooks. And then there's the new sexy, "low power and small." Why "Atom"? Because "it's the smallest element of computing."

    Along with the Atom chip also comes Centrino Atom. With "Centrino" being a rather successful Intel strategy of bundling various Intel components and making the package look superior to just an Intel processor and then third party components, Centrino Atom is no surprise. Centrino Atom will include an Atom chip and companion chips for graphics and wireless for "the best mobile computing and Internet experience on these new devices."

    The thermal design power (TDP) specs are certainly impressive. Just 0.6 to 2.5 watts, as opposed to almost ten for an ultra-low power Core Duo processor. And the 45nm process is unimaginably microscopic (the PXA processors use 90 nm) and certainly a testament to Intel's expertise. Thermal design power, of course, is a somewhat odd measurement. It just describes, according to Wikipedia, the "maximum amount of power the cooling system in a computer is required to dissipate."
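    A rough sense of why those watt figures matter can be had with some back-of-the-envelope arithmetic. The battery capacity and the 6-watt figure for the rest of the system below are my own assumptions, not manufacturer numbers, and since TDP is a thermal ceiling rather than average consumption, the result is closer to a worst-case floor on battery life than a prediction:

```python
# Back-of-the-envelope sketch (battery capacity and the non-CPU system
# load are assumed values): how processor power draw translates into
# battery life. TDP is a thermal ceiling, not average consumption, so
# treating it as the draw gives a pessimistic estimate.

def runtime_hours(battery_wh: float, system_watts: float) -> float:
    """Hours of runtime if the system draws system_watts on average."""
    return battery_wh / system_watts

battery = 40.0  # assumed notebook battery capacity, in watt-hours
base_load = 6.0  # assumed draw of screen, disk, radios, etc., in watts

core_duo_ulv = runtime_hours(battery, base_load + 10.0)  # ~10 W ULV Core Duo
atom_chip = runtime_hours(battery, base_load + 2.5)      # 2.5 W Atom, top of range

print(f"ULV Core Duo system: {core_duo_ulv:.1f} h")
print(f"Atom-based system:   {atom_chip:.1f} h")
```

    Under these assumed numbers the Atom-based machine nearly doubles the runtime, which is the whole point: once the processor stops dominating the power budget, the screen and the rest of the system become the limiting factors.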

    To me, the question is where the chip will really fit in. One of the Intel clips has the spokesperson showing an OQO type of little computer with a slide-out keyboard. Quite obviously, the overall goal is to provide the kind and quality of internet access we've all become used to, and even more so since Apple showed that "real" browsing is possible even on something as small as the iPhone.

    So what does Atom mean for the manufacturers of all those PXA-powered devices? With Marvell taking such a low-key approach, are they hustling to see if Atom perhaps is a better alternative? I am certain Intel hopes so. What are the respective power requirements? I don't think I've ever seen a TDP spec for the PXA chips. Whatever specs there are for the PXA320 would indicate substantial capabilities and power, but so far we haven't seen any device that takes advantage of all of its remarkable range of multimedia features (see Marvell PXA320 features).

    There are, of course, other considerations. For example, we're seeing new products with Intel's A100/A110 chips that are part of Intel's UMPC 2007 platform. Those chips, essentially lower power M-cores, also use 90 nm technology, run at 600 and 800MHz and have 3 watt TDPs. Will these be totally replaced by the Atom chips that appear to have a range from 500MHz to 1.8GHz at lower to equal TDPs?

    Time will tell.

    Posted by conradb212 at 5:53 PM

    February 20, 2008

    What do we make of Geode, VIA and Intel A100 powered devices?

    As of late, I've seen an increasing number of small tablet-style devices that run Windows but do not use one of Intel's heralded Core processors, or even one of their lower-powered predecessor chips. That inevitably brings up the central conundrum the industry has been dealing with for the past 15 years or so. After dabbling with Windows CE in its various versions, Microsoft has pretty much decided that "real" Windows is the way to go. Any device that is not solely dedicated to performing a single task, or running a single custom app, will likely do other things or have to communicate with other computers. And that is when the problems start. Anything that doesn't run "real" Windows will inevitably have browser problems, missing drivers and plug-ins, and so on. Might as well give up and build a small device with real Windows. That can be done, but real Windows was designed for desktops and powerful laptops. It wants plenty of processing power and a big screen lest it all becomes an exercise in frustration.

    So here we are, with Vista taxing even the most powerful machines and even XP desktops struggling to keep up with the myriad of functions and giant applications and add-ons and start-up programs and other gunk. Heck, my own personal 2GB Gateway notebook takes so long to boot Vista or bring up programs that I usually have meandered off to some other task by the time it's done. And yet, I see Microsoft plugging its Intel Ultra Mobile Platform 2007 with its A100 and A110 processors running at 600 and 800MHz, and AMD's Geode LX800 and LX900 at 500 and 600MHz. VIA's ultra low voltage C7-M runs at 1-1.5GHz and is probably in a somewhat different class, but in all instances we're far from Intel Core Duo and Core 2 Duo specs.

    The question simply becomes this: Can a tablet powered by one of these chips really run Windows XP without its owner quickly giving up on it because it is too slow?

    Unfortunately, there isn't an easy answer. See, it's really all a matter of software. Let's not forget that a couple of decades ago perfectly functional computers booted faster and ran their spreadsheets, word processors and databases faster than what we have today, all on a few megabytes of memory and 16MHz processors. We have vastly more functionality today, but it's all become so complex that it often barely moves, and that is WITH powerful processors.

    So why not simply scale back the software? That's a good idea but far from simple. If only we could load Windows 98 onto a new machine and make it do whatever we need. It'd probably fly even on a -- by today's standards -- vastly underpowered machine. Sadly, it'd also be almost useless because it couldn't connect to anything and would be incompatible with almost everything.

    So the answer is to use today's software that speaks today's protocols and runs today's drivers, but remove as much overhead as possible. That can be done in several ways. You can, for example, load a standard operating system but do away with all the clutter and shovelware today's computers come with. You also remove all unnecessary startup programs, all unneeded background processes and so on. That still results in a big system, but it's surprising how much speed can be recovered by putting Windows on a diet.

    Another approach is using Windows XP Embedded. What does "embedded" mean? Basically that you only pick those parts of a componentized operating system that you absolutely need for a task. Standard Windows XP or Vista loads a computer with everything under the sun, whether you ever need it or not. An embedded version of Windows XP has ONLY what a device needs to do its job. That means it will be limited, but it will also be faster and use fewer resources. XP Embedded is especially well suited to run on a relatively small flash disk.

    Yet another approach is to use one of the various Linux variants. Standard Linux distributions also have grown over the years and they now need much more space and have far larger resource requirements than they used to, but they are generally still smaller and faster than Windows. And since Linux is free and all its major applications are free, there can be substantial cost savings. Not everything is free, of course; companies that create custom applications to run on Linux systems can and will charge for licenses and upgrades.

    All this gets me back to the original question: can a small slate computer with a minimal processor and minimal resources really run Windows at an acceptable pace? Does it all make sense? Some rather prestigious manufacturers seem to think so. Getac announced its lightweight rugged E100 tablet that uses an Intel A110 chip. Roper Mobile Technology announced the Geode-powered Duros Tablet PC. HTC's intriguing "Shift" can run both Windows and a clipped version of Windows Mobile, and Windows runs on an Intel A110. And there is a whole slew of other small devices that roughly follow what once was the Microsoft "Origami" ultra-mobile PC spec. All do Windows, and all use one of those ultra-economical processors (I hate the term "low-power" as it implies low performance rather than high energy efficiency) that are supposed to deliver an adequate user experience while still providing halfway decent battery life.

    What I'd really like to do, and I hope we get a chance here at RuggedPCReview.com, is to compare the Windows XP, XP Embedded and Linux versions of some of those machines side-by-side. I somehow cannot imagine that anything that runs XP on a 600MHz processor will be blindingly quick when even my 3GHz desktop is a slug, but it's entirely possible that a lean and specially configured rugged tablet with one of those high-efficiency (see, I didn't say "low power") processors is just what the doctor ordered.

    Posted by conradb212 at 8:05 PM

    January 24, 2008

    Panasonic -- Still top of the heap?

    We just finished taking another detailed look at an old acquaintance, a Toughbook from Panasonic. Now called the CF-30, it's a descendant of the original Toughbook that goes back many years and essentially created a whole new market. The way that came about was that a number of Japanese companies that had once dominated the US laptop market found it increasingly difficult to be profitable. At some point the US launched protectionist measures against TFT LCD panels, making them more expensive. And the Taiwanese were beginning to move in.

    Panasonic's approach was to seek new ways and they decided to gamble on a niche they had discovered. As notebooks were increasingly used in the field, customers became unhappy with standard laptops breaking all the time. It really wasn't the laptops' fault. They were built to be used at home and in an office, and then being shuttled back and forth. But with companies now deploying them for all sorts of field applications, they just couldn't handle it. So Panasonic conceived the idea of notebooks that were as elegant and powerful as standard laptops, but a lot tougher. And they came up with the "Toughbook" moniker, which was brilliant.

    For many years, Panasonic owned the market. It wasn't that they were so much better than the rest, but their products sure looked better, and they had giant Matsushita behind them, so there were plenty of resources and off-the-shelf components right inside the company. And they knew the importance of industrial design. Compared to the utilitarian-looking competition at the time, Panasonic's ruggedly handsome Toughbooks were simply in a league of their own.

    Panasonic also did a terrific job working with the press. In the heydays of vertical market print publications, when we did Pen Computing Magazine, Panasonic's PR folks always made sure we were informed of every new product. They made review units available and just generally helped us in every way to get information and hands-on time with the units so that we could keep our readers informed. So we reviewed many Toughbooks, liked most and criticized some. Panasonic was always appreciative of feedback and apparently passed constructive criticism on to their engineers as the machines steadily improved.

    But time does not stand still, and the only constant is change. The rest of the industry began catching up and Panasonic, as the market leader, had a bullseye on their back. They were everyone's target. All of a sudden, superb industrial design was no longer exclusively found at Panasonic. One look at currently available rugged and semi-rugged notebooks shows that it's a real race now, and one where Panasonic no longer automatically has an edge.

    There are other issues. Relationships matter, and after many years of superb access to Panasonic through a couple of long-term PR people, things changed and it became next to impossible to get anything from Panasonic. Seemingly every contact with them was from a different PR person. So when we emailed one of them, s/he was already no longer with the company, or the PR firm had changed. Not good. Whoever we deal with does their best, of course, and sometimes things just cannot be helped.

    Anyway, we finally did get another longer term hands-on with a Toughbook. As described in detail in our review on the site, the Toughbook CF-30 is almost unchanged. Which is really a good thing. After all those years, that particular platform -- the traditional full-size rugged notebook -- is as mature and perfected as it gets. And having talked to Matsushita's engineers and designers in Japan, and having seen the production facilities in Osaka and Kobe, I am not surprised at the extremely high level of execution, fit and finish. It's probably nearly impossible to match Panasonic's sheer perfection when it comes to wizardry with magnesium or applying the most eye-catching finish to it.

    And Panasonic certainly keeps the machine technologically up-to-date. The one we reviewed had an Intel Core Duo processor, but by the time the review was over, in January 2008, Pana had already revved the machine again and it now has a Core 2 Duo and a few other enhancements, albeit not enough to change the name from CF-30 to CF-31 just yet.

    Outdoor viewability is becoming ever more important, and there has been a lot of progress in that field. Our technology editor, Geoff Walker, is an expert in that field, and thanks to him we have a pretty good idea of the state-of-the-art. From what I can tell, and from what I have seen with my own eyes, Panasonic is not completely at the forefront with their outdoor displays, but they are close. No display is anywhere near perfect yet, but the progress that's been made is amazing, and current technology can only do so much against the sun.

    But is the CF-30 still on top? That's hard to say. In terms of look and finish, it remains unsurpassed, but it is an aging platform. The touchpad was just plain unresponsive and certainly didn't make the machine easy to use. In the olden days, a quick call to our sources at Panasonic might have yielded an explanation as to why a particular type of touchpad was used, but these days the path of communication is longer. Fortunately, today's company websites contain so much information that grabbing a missing spec is usually just a lookup away, but, alas, as pretty and professional as Panasonic's Toughbook website looks, it is a total bear to navigate and find anything. If it takes me several screens to actually find a product, something's wrong. And the confusing, inconsistent way Panasonic literature and online resources handle ruggedness specs is not doing them any favors. And Panasonic's "Legally we can't say...." campaign we're assaulted with in every airport or business magazine, well, the less said the better.

    But what about other Panasonic products? Well, most are still there and more or less the same. I saw the prototype of the very compact CF-18 notebook convertible at Panasonic in Japan back in 2002, and we later reviewed the final product. It's almost six years later now, and the CF-18 is now the CF-19. Is it still the best? Maybe, maybe not. Getac's V100 competes with it now, and when we reviewed that rather excellent machine we wondered whether Panasonic had kept up.

    Don't get me wrong. The Panasonic CF-30 is an awesome machine. But the world has changed, and it's not clear to me if Panasonic has made all the right moves.


    Posted by conradb212 at 5:05 PM

    November 22, 2007

    Thoughts about rugged handhelds -- the Juniper Archer

    For the past few weeks we've had an Archer Field PC from Juniper Systems. "Field PC" is perhaps a bit of a misnomer as "PC" generally implies a Windows-based computer. The Archer is Windows-based alright, but it's Windows Mobile, so it's really a Pocket PC or whatever Microsoft is trying to call handhelds these days. We still generally call these machines Pocket PCs, or just PDAs, the term Apple originally used when it came out with the Newton back in 1993.

    Creating a "rugged" PDA isn't easy. And just like "rugged" notebooks or slate computers, the degree of ruggedness varies greatly. Commercial products really don't have that problem. It's the electronic guts and then a plastic case that should look good, be small and light, and hold up in daily use. It doesn't have to be waterproof or be able to absorb punishment, like drops or getting crushed and so on.

    For mobile computers used in field work, things are very different. If you use a machine outdoors, all sorts of stuff can happen. For one thing, outdoors is not an air-conditioned 72 degrees all year round. It can get very cold and very hot. Some electronics don't like that. Also, outdoors it rains. And sometimes pours. And a handheld terminal may even fall into a puddle or get sloshed by water some other way. Dropping it is a distinct possibility. And that generally happens when you pull it out of a bag or pocket, or while holding it. So it should survive four to five foot drops. There's other stuff to consider. If it goes up in a military airplane, pressure may be an issue. If it's strapped to a truck, vibration can be the killer issue. And in certain flammable environments it is imperative that there is no chance the device can ignite things with a little spark or arc. There's more, but one thing isn't usually listed: if a device must be rugged, it's likely going to be used outdoors, and outdoors there is sunlight. So the display must be readable outdoors. That's never included in ruggedness specs as it is, technically, not an environmental exposure issue. But it's part of what a rugged device must be.

    So how do manufacturers go about building rugged handhelds? In many different ways. While the guts of a Windows Mobile/CE device are fairly standard, rugged housings most definitely are not. As a result, almost everyone does it in a different way. Here at RuggedPCReview.com, we love looking at, and analyzing, those different design approaches.

    In a way, making a handheld tougher is not that different from making a slate or notebook computer tougher. Seek the traditional weak points and eliminate them. Consider all possible accidents and challenges and address them. And since building a rugged device usually means higher cost, larger size, and higher weight, have a very clear view of what exactly you're trying to achieve. The design must be just right for its intended use.

    So how does all that apply to the Archer handheld built by the friendly folks at Juniper Systems in Logan, Utah? Well, they have a history of catering to agricultural markets, and then branched into all sorts of other outdoor markets, like surveying, forestry, fisheries and so on. So whatever they build should be fairly waterproof, able to handle a drop, and just generally be a tool that its owner can take along on a hard day's work in the wild, without having to baby the computer.

    When you first see the Archer, and usually you see the one with bright orange protection molding, it has a friendly look that is far removed from some of the deadly-serious designs that, if they were in a Pixar movie, would probably say, "Sir, unless you're military and have proper clearance, you are not authorized to touch me. Please step away." When you look at the Archer, alas, pumpkin comes to mind. Same orange, same texture. That provides excellent visibility, which is a good thing if you accidentally dropped it in the woods and then have to backtrack to find it. For that, bright orange is much better than camouflage.

    But take a closer look and the Archer is a rather nasty wolf in sheep's clothing. The friendly elastomer overmold comes off easily and underneath it's a hefty case made of magnesium. Hefty as in you could probably take a sledgehammer to it. I described all of this in the review, but seeing this "compartmentalized" approach to designing a rugged device was really interesting. The Juniper engineers must have said, "Look, if we enclose the whole box in a waterproof and dustproof shell, how are we going to have connectivity? Hardly possible. So let's separate things into a totally sealed core and then protect that with rubber molding that can easily be replaced. And we just seal the electronic contacts and leave the actual jacks exposed. Think that'll work?"

    It does, with some limitations. The Archer's housing is certainly an "armored core" and invulnerable, but dust and water can get into the jacks and other places. Which means the Archer DOES have great connectivity in an ultra-rugged device; if it falls into the water or hits a dust storm it will not fail, but afterwards you have to take it apart and dry and clean everything outside of the armored core.

    A couple of months ago we did a little stunt with the Trimble/TDS Nomad by actually taking it scuba diving. It was just in a pool, but it made for great video and underwater pics. I wanted to do the same with the Archer after we determined that it could do it, but the water was pretty cold by now and so we just dropped it into the pool. Juniper's most helpful Pat Trostle had told me how they often display the Archer in a fishtank at trade shows, but that they keep an eye on air bubbles which usually mean the thing is flooding. I've flooded a few underwater cameras in my day and know what Pat meant. So when bubbles emanated from the Archer upon being dropped into the pool, I felt a little burst of anxiety until I remembered that, of course, there will be some bubbles. They come from the air escaping the outside overmold and the plastic block that houses the interface jacks. No water gets inside the core, of that I was sure. And it didn't. But it had to be taken apart and carefully cleaned and dried afterwards. Professionals would do that anyway, so no worries there. Saltwater may be a bit of an issue and I wonder if Juniper has data on the long-term effects of repeated contact with saltwater.

    Later, we did drop tests by carelessly swiping the Archer off a wall and down onto a rather rough driveway surface. We did that two or three times and I was afraid the unit would go back to Juniper with some good scratches. Amazingly, no scratches at all. That is impressive. Rugged devices with exposed metal almost always scratch. Apparently not this one.

    Like many mobile computers, the Archer can be expanded in a number of ways, via an SD and a CF card slot. That way customers can use their own choice of expansion cards rather than being stuck with whatever is integrated into the unit. That's a good solution, and Juniper offers several extension caps that fit over such expansion cards. Amazingly, they claim that all of those expansion cards also provide the exact same IP67 ingress protection rating. That is a tall order. The way they do it is by separating the extension caps into two pieces. One is a precision-engineered adapter plate with an o-ring type of seal. The cap then screws on top of that. It works beautifully. But as anyone familiar with underwater housings knows, the o-ring approach depends totally on having immaculately maintained o-rings or sealing plates (which is what Juniper uses). Rings are, as far as I am concerned, easier to maintain as they can be replaced. The soft rubber sealing plate in our adapter was slightly deformed, and I wondered if it still sealed properly. I didn't want to risk flooding the machine and thus didn't put it to the test.

    It was an interesting experience, reviewing the Archer. It is fully up to the job and probably suitable for a far wider range of applications than Juniper currently pursues. But it also showed me again that design of professional equipment is only one part of the whole package. The other is the care the professional him/herself takes in working with, and maintaining, the equipment. These are tools for tough jobs, and good professionals always treat their tools with care and respect.

    Posted by conradb212 at 5:08 PM

    November 15, 2007

    Tests and reviews - how much punishment?

    I love rugged machinery, and so does everyone else here at RuggedPCReview.com. When a new machine comes in, everyone wants to see it, touch it, comment on it, and speculate how much abuse it can take. And this is where it gets interesting, the degree of abuse.

    Rugged machines are, by design, conceived and built to take a beating and survive. But the only way to know for sure if they indeed CAN take a beating is to administer one. And whether or not we should do that is a sensitive issue. A lot of this equipment is not inexpensive. So do we take a $4,000 computer, drop it, twist it, spill coffee on it, try to see if the screen is really scratch-proof and whether it's really water-proof? And then send back, at best, a severely banged-up machine, and at worst, one that is destroyed? Dvorak may get away with that and maybe some of the few remaining big print magazines, but I am not sure most eval unit coordinators would look upon such a reputation with great favor.

    That puts us in an interesting situation. We really think that rugged equipment should be just as rugged as manufacturers say it is, and sometimes we have doubts. We also see some stuff we are not very fond of. For example, glossy metallic surfaces that can and will get scratched in an instant simply should not be on a rugged machine, no matter how cool they look. But even there, do we just mention that in a review, or see just how badly it scratches (or not), document that, and then send it back?

    Most rugged machines come with ruggedness specs. MIL-STD results are listed and perhaps compliance with other testing procedures as they may vary from country to country. That can include inhouse testing and third-party independent tests in labs. Now I have seen many of those torture chambers -- those of Panasonic, GD-Itronix and Intermec, to name a few. I've seen machines being baked, shaken, rattled, dropped, scratched, exposed to extreme humidity, vibration, pressure, materials fatigue testing and more. The tests are real, and they certainly reveal weak points that are then addressed.

    Problem is that the reported testing results are not always very informative. MIL-STD testing means just that; a piece of equipment has been tested in accordance with the procedures mandated in a MIL-STD document. Often it is not reported what the outcome was, or if the machine even passed. Or only part of the test results are included in the specs. So prospective customers often do not have enough data to really compare. Some of the big companies in the field are guilty of not including truly meaningful ruggedness specs, and that doesn't do anyone a favor.

    Sometimes we do go beyond simply describing a machine and administer our own torture testing. When Trimble/TDS claimed their Nomad handheld was waterproof to the extent that it would survive for an hour in a full meter of water, we decided to see if that was really so. I made sure they were okay with that. We used scuba gear and actually took it for a dive. I used it underwater and pushed the specs. The Nomad went down to maybe seven feet, it stayed underwater for a good while, and it survived. It worked underwater and I even used it for handwriting recognition underwater. It's all on video and up on YouTube.

    As a result, some manufacturers may be reluctant to send us their gear because -- hey -- those guys at RuggedPCReview may actually check the ruggedness specs for themselves. Others send us gear with the specific request to do so.

    A current example: Toshiba makes a remarkable machine, the R500 notebook. It is ultra-light and definitely not fully rugged. But it has an awesome outdoor-viewable display and was designed to take the kind of punishment that may occur on the road. I think a Toshiba rep called it "executive-rugged". The R500's display case is very flexible, so much so that we had our doubts whether it'd hold up to any abuse. Well, Toshiba explained it was designed that way, and there is even a video showing the machine taking abuse, with the LCD being twisted to a frightful extent, and surviving. We're tempted to see if we can duplicate that, but should we? The last thing I want to do is send the R500 back with a busted display.

    For the most part, all this doesn't pose a dilemma. Most of the time the official test results are very clear and we see no reason to doubt them, nor would we have the ability to duplicate the torture testing. But the question does come up at times, and hence this column.

    What we would like to challenge the rugged industry to do is this: State all ruggedness specs fully and clearly enough so readers will know what exactly the machine passed, and, more importantly, what it means.

    Posted by conradb212 at 2:43 PM

    September 8, 2007

    Underwater computing?

    Underwater computing? Now that's a novel concept. For the past 15 years I've been dealing with rugged computing equipment, machines that can be dropped, survive in dusty environments, continue to operate whether it's scorching hot or really cold. They can also handle rain, though these days the trend seems to be surviving an accidental coffee or soda spill onto the keyboard. Sort of like cupholders in cars have become a make-or-break feature, second only to how many DVD screens for entertainment they have.

    Anyway, it's not unreasonable to expect computers to come in contact with water. It covers 70% of the planet. People hang out around water. It rains. So we might expect a rugged handheld to continue to function if it is exposed to water. Why am I thinking of that? Well, maybe it's because I took up diving last year and since have been exposed to some pretty amazing equipment that does work underwater.

    For example, divers depend on dive computers. That's because diving subjects the human body to much higher pressure than it is subjected to on the surface. To counteract that pressure, the air a scuba diver breathes is also much denser. At a depth of 33 feet, for example, the pressure is twice that on the surface, and the air that is released from the scuba tank via the regulator is also twice as dense. That means that the partial pressure of nitrogen is twice as high, and according to William Henry's law, more nitrogen dissolves into body tissues. Once the diver comes up and the pressure lessens, that nitrogen is released from the tissues again. Normally it just goes into the bloodstream and is safely breathed out through the lungs. However, if the diver ascends too quickly, or if s/he has absorbed a large amount of nitrogen during a long, deep dive, the released nitrogen can form bubbles, and that can have dire, and at times deadly, consequences. Divers used to compute safe dive times on dive tables, and that is still being taught in scuba classes, but almost everyone uses a dive computer these days. Dive computers are sophisticated devices that continually measure depth and compute absorbed nitrogen. They show numerous values on their displays, tell the diver how much longer s/he can stay at a given depth, and when it is time to go up.
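    The pressure math described above is simple enough to sketch out. Below is a minimal, illustrative Python snippet, assuming the common rough rule of about one additional atmosphere per 33 feet of water and air being roughly 79% nitrogen; real dive computers use far more sophisticated tissue models:

    ```python
    # Rough sketch of the pressure arithmetic behind dive computers.
    # Assumptions (simplifications): ~1 extra atmosphere per 33 ft of
    # water depth, and breathing air that is ~79% nitrogen.

    ATM_PER_FOOT = 1.0 / 33.0   # added atmospheres per foot of depth
    N2_FRACTION = 0.79          # approximate nitrogen fraction of air

    def ambient_pressure_atm(depth_ft):
        """Total pressure in atmospheres at a given depth in feet."""
        return 1.0 + depth_ft * ATM_PER_FOOT

    def nitrogen_partial_pressure_atm(depth_ft):
        """Partial pressure of nitrogen in the breathing gas at depth.

        Per Henry's law, the nitrogen dissolving into body tissue is
        proportional to this partial pressure.
        """
        return N2_FRACTION * ambient_pressure_atm(depth_ft)

    # At 33 ft the total pressure -- and with it the nitrogen partial
    # pressure -- is twice the surface value, as described above.
    print(ambient_pressure_atm(33))           # 2.0
    print(nitrogen_partial_pressure_atm(33))  # 1.58, vs 0.79 at the surface
    ```

    A real dive computer integrates this partial pressure over the whole dive profile, per tissue compartment, which is why it can tell the diver the remaining no-decompression time at any moment.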

    Needless to say, dive computers must be totally and completely reliable. Failure is not an option. Leaking is not an option. Bugs are not an option. And wimpy battery life is not an option. And they must be able to handle not just a bit of splashing, not just a few minutes at three feet, but potentially hours at hundreds of feet. Without failing, ever. My dive computer has a wireless connection to my air tank so that it knows how much air I have left. After using the computer for a YEAR, the battery is still at 95%. Extreme "technical" diving may require very sophisticated dive computers to perform numerous life-supporting tasks at depths of many hundreds of feet. Sure, some look just like watches, and we're used to trusting watches to survive swimming and snorkeling and a bit of diving. But many are larger -- sophisticated devices bigger than smartphones or PDAs, and with large displays and several controls.

    But it's not just dive computers. It is also cameras. As a reviewer of rugged mobile computing equipment I have an appreciation for one of the standards by which we judge a machine's ability to protect itself from dust and water, the Ingress Protection, or IP, rating. A handheld rated at IP56 is one tough machine and can likely survive in just about any environment. If, in addition, it can survive four foot drops, well, that is a very rugged device, and it probably looks like one, too.
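    For readers new to IP ratings: the two digits encode separate protections, the first against solids and dust, the second against water. A hedged little Python sketch (the level descriptions are my abbreviated paraphrases of the IEC 60529 scale, not official wording):

    ```python
    # Sketch of how a two-digit IP (Ingress Protection) rating breaks
    # down. First digit: solids/dust; second digit: water. Descriptions
    # are abbreviated paraphrases of the IEC 60529 levels.

    SOLIDS = {
        0: "no protection",
        1: "objects > 50 mm",
        2: "objects > 12.5 mm",
        3: "objects > 2.5 mm",
        4: "objects > 1 mm",
        5: "dust-protected (limited ingress)",
        6: "dust-tight",
    }

    WATER = {
        0: "no protection",
        1: "dripping water",
        2: "dripping water when tilted",
        3: "spraying water",
        4: "splashing water",
        5: "water jets",
        6: "powerful water jets",
        7: "immersion up to 1 m",
        8: "continuous immersion beyond 1 m",
    }

    def decode_ip(rating):
        """Turn a rating like 'IP67' into its two protection levels."""
        digits = rating.upper().removeprefix("IP")
        return SOLIDS[int(digits[0])], WATER[int(digits[1])]

    print(decode_ip("IP56"))  # ('dust-protected (limited ingress)', 'water jets')
    print(decode_ip("IP67"))  # ('dust-tight', 'immersion up to 1 m')
    ```

    So an IP56 handheld shrugs off water jets but not immersion, while the IP67 devices discussed elsewhere on this site are dust-tight and survive a dunk in a meter of water.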

    Diving exposed me to equipment that can do all that, and more. Case in point -- a camera that Olympus makes. If you were to look at the Stylus 770 SW, you'd see a snazzy, handy little digital camera measuring 3.6 x 2.3 x 0.8 inches and weighing a bit over six ounces, battery included. It looks very elegant with a matte-silver finish. It has a bright 2.5-inch LCD display that's larger than those on most smartphones. The camera has about a dozen hardware controls, mostly pushbuttons, but also a navigation disk. There is a microphone and a speaker. What is special about it?

    It is rated IP58. It can survive 5-foot drops. It is crushproof. It can operate at 14 degrees Fahrenheit. And it can be operated in 33 feet of water.

    It does that without any protective case at all. No rubber bumpers, nothing. Just very intelligent design, meticulous manufacturing, and good sealing. It costs just over US$300.

    As a diver, I took that camera down to not only 33 feet, but 67 feet, and later 77 feet. It stayed underwater for a good hour. No problem at all. At the maximum depth I reached, the water pressure was so great that some of the push buttons were pushed in. And a small black rectangle showed up in the center of the LCD, from the water pressure. But it continued to take pictures.

    What those dive computers and cameras like the Olympus 770 SW show is that it is possible to create sophisticated electronic devices that can function underwater. I totally agree that there probably isn't a great need for handhelds you can take diving. Then again, some people out there might just like to have one. Most likely, we haven't even really started to think about possible applications.

    I am pretty sure military divers would make good use of an underwater rugged computer. And commercial divers would, too. Even recreational divers might just love to take a handheld underwater, or perhaps a tablet so they can write on it or doodle or draw. Divers communicate via hand signals mostly, and those are often misunderstood. As an alternative they write on little slates. A computer or electronic slate would certainly be much better. As I write this I am supposed to follow up on a new underwater texting technology -- texting like SMS on cellphones. My guess is whatever device is used for that must be rugged and quite waterproof.

    As is, we have a brand-new Trimble/TDS Nomad rugged handheld in our lab. It is a very tough handheld computer with an IP67 rating and thus was designed to survive immersion into water. We may put that to the test and record the performance on video. Simply don scuba gear and find a nice comfy spot somewhere at a depth of six or seven feet. Then see if it works. Without, of course, exceeding design specs. It's been pointed out that touch screens have not been designed to deal with water pressure and may thus fail to operate properly. At a depth of seven feet, the pressure on the touch screen would indeed be about 21% higher than on the surface, and this might make it inoperable, depending on design.
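    The 21% figure follows from the same rough rule of about one extra atmosphere per 33 feet of fresh water; a quick sanity check in Python:

    ```python
    # Extra pressure at 7 ft of depth, using the rough rule of
    # ~1 additional atmosphere per 33 ft of water.
    depth_ft = 7
    increase = depth_ft / 33.0      # added pressure in atmospheres
    print(round(increase * 100))    # 21 -- percent above surface pressure
    ```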

    Is rugged underwater computing on the horizon? Is there a need for it? Personally I think there is. There are practical applications. And besides, it is always interesting to see if something can be done. Hey, Olympus did it, with a vengeance.

    Posted by conradb212 at 11:12 PM

    July 31, 2007

    Marvell, not Intel

    I spend a lot of time updating the vast database of rugged devices listed and reviewed here at RuggedPCReview.com. Specs change all the time but the rugged and mobile computing industry is usually very modest when it comes to press releases and announcements. It's not like certain other fields where every new cellphone ringtone or executive promotion warrants a major PR campaign. So the way we go about it is making the rounds of all the companies, via their web sites, and check for updated specs.

    One thing I noticed is that even in updates, almost everyone continues to refer to the "Intel XScale" processor, the family of chips that power almost all Windows Mobile and Windows CE devices. Well, Intel doesn't make them anymore. They sold that business to a company named Marvell. Here's what happened:

    Marvell Technology Group -- a silicon solutions high tech firm based in Santa Clara, California -- decided to become a supplier in the cellphone and consumer electronics markets, and officially took over Intel's communications and applications processors in November of 2006. The original deal between Intel and Marvell was made in June of 2006 when Marvell agreed to buy the business from Intel for US$600 million. Under Intel's watch, the XScale PXA series were used in a wide variety of devices, with the PXA2xx used in Windows Mobile devices and the PXA9xx in such handhelds as the Blackberry 8700. Early on, Intel benefitted greatly from Microsoft's decision to switch Windows CE from a more or less open processor platform to mandating XScale. The deal between Intel and Marvell took several months to complete as Marvell had to find a manufacturer for the chips. Under the deal, Marvell took over the 3rd generation XScale processors codenamed Monahans, the 1.25GHz successor to the PXA27x Bulverde processors.

    Losing no time, in December 2006 Marvell launched the PXA 3xx series, consisting of the high-end PXA320 running at 806MHz, the cost-optimized low-end PXA300, and the PXA310. The PXA300 and 310 run at clock speeds up to 624MHz, with the PXA310 adding VGA video playback. The PXA320 is able to scale from 806MHz down to 624MHz to conserve power when full performance isn't needed. The chip is also more energy-efficient than the predecessor Bulverde processor, especially under heavy video and audio load. The PXA320 can run VGA resolution video at 30 frames per second and support a 5-megapixel digital camera and video telephony, all at lower power consumption than the older XScale chips. The first products with the 806MHz PXA320 are now appearing, such as the recently released Trimble Nomad.

    From what we can tell, Marvell will continue to offer both the XScale PXA27x family as well as the older PXA255. But they are now Marvell chips, and no longer Intel chips. So let's do a global search and replace: It's Marvell XScale PXA and no longer Intel XScale PXA.

    The emergence of the PXA 3xx processors is exciting. More performance and more capabilities at lower power consumption. That's great. We can't wait to do hands-on reviews of the first Marvell XScale-powered devices!

    Posted by conradb212 at 11:11 PM

    July 25, 2007

    The RuggedPCReview Blog launches

    Well, we finally added a blog section to RuggedPCReview.com. Yes, I know, everyone and their uncle has a blog these days, but I think it definitely makes sense to have one at a site like this, where we are compiling information on just about every rugged mobile device out there. As is, our front page lists daily news and alerts readers to additions to the site, but often there is more than that. What's in a review is not always the whole story -- there's more to tell. Impressions, circumstances, interactions with PR people, engineers, product managers, testing -- all the stuff that generally does not go into a review. That's one thing.

    Another is that we tend to have our own opinions on matters. Be it new developments in the field, new products, company acquisitions, mergers, or consolidations. Anything that affects the rugged industry landscape. Or promising new technologies, and how we see them fitting into rugged computing. Sometimes we do factory visits, and those are always fascinating and provide new insights.

    Other times we have gripes, or come across stuff that simply doesn't make sense. So we may wonder, "What were they thinking?!?" and contemplate that. Or we may be presumptuous enough to offer commentary and recommendations, or share our views on developments.

    Finally, we go to shows. We see new stuff. We talk to people. All that will go in here. And it won't just be us guys here at RuggedPCReview.com doing all the commenting and blogging. No, we'll invite guest bloggers to share their views and insights, so that you'll get as broad a cross-section of opinions as possible. So if you have something to say or contribute, let us know via email to cb@ruggedpcreview.com!

    Posted by conradb212 at 11:11 PM
