
September 13, 2017

The impact of iPhones on the rugged handheld market

Apple has been selling well over 200 million iPhones annually for the past several years. This affects the rugged handheld market both directly and indirectly. On the positive side, the iPhone brought universal acceptance of smartphones. That accelerated acceptance of handheld computing platforms in numerous industries and opened new applications and markets to makers of rugged handhelds. On the not so positive side, many of those sales opportunities didn't go to providers of rugged handhelds. Instead, they were filled by standard iPhones. There are many examples where aging rugged handhelds were replaced by iPhones, sometimes by the tens of thousands. That happened despite the relatively high cost of iPhones and despite their inherent fragility.

By now it's probably fair to say that the rugged handheld industry has only peripherally benefitted from the vast opportunity created by the iPhone's paving the way for handheld computers. Why did this happen? Why did it happen despite the fact that iPhones usually don't survive a single drop to the ground without damage, despite the fact that only recently have iPhones become spill-resistant, despite the fact that iPhones need a bulky case to survive on the job, and despite the fact that their very design -- fragile, slender, gleaming -- is singularly unsuited for work on the shop floor and in the field?

One reason, of course, is that Apple is Apple. iPhones are very high quality devices with state-of-the-art technology. Apple has universal reach, excellent marketing and packaging, and massive software developer support. And despite their price, iPhones are generally less expensive than most vertical market rugged handhelds. Another reason is that creating a rugged handheld that comes even close to the capabilities of a modern consumer smartphone is almost impossible. Compared to even the larger rugged handheld manufacturers, Apple is simply operating on another level. The combined annual sales of all makers of rugged handhelds, tablets and laptops amount to only about what Apple alone sells in just over a week.

All that said, what can the rugged handheld market learn from Apple? Actually quite a bit.

Take cameras for example. The iPhone has almost singlehandedly obliterated the market for consumer cameras. People take pictures with their iPhones today, not with dedicated cameras. That's not only because it's convenient, it's also because iPhone cameras are quite excellent. They are state-of-the-art and have massive software support. Sending pictures from an iPhone to another phone, a computer, social media or anywhere else is easy. iPhone pictures and videos can easily be viewed on a big screen TV via AirPlay.

How important are cameras in smartphones? Important enough that the iPhone 7 and 7 Plus became hugely successful despite only modest overall improvements over the prior iPhone 6 and 6 Plus. The iPhone 7 Plus came with two cameras that seamlessly worked together, there was an amazingly successful "portrait" mode, and overall picture taking with an iPhone 7 Plus became so good that here at RuggedPCReview.com we switched all product photography and video from conventional dedicated cameras to the iPhone 7 Plus.

The same is again happening with the new iPhone 8 and 8 Plus. Relatively minor improvements overall, but another big step forward with the cameras. The iPhone 8 Plus cameras offer optical image stabilization, less noise, augmented reality features thanks to innovative use of sensors, stunning portrait lighting that can be used in many ways, and also 4k video at 60 frames per second, on top of full HD 1920x1080 video in 240 frames per second slow motion. And the new iPhone X ups the ante even more with Face ID that adds IR camera capabilities and sophisticated 3D processing.

Is this all just for fun and play? Oh no, it goes way beyond that. Imaging has become essential in today's world, both on the personal and the business and enterprise level, and beyond. People document and record just about anything with their smartphones today, and that's made possible both by superior quality of those smartphone cameras as well as by the sheer convenience of it. People take pictures of anything and everything today. Good, useful, high quality pictures and video.

Can that be done with rugged handhelds as well? After all, the ability to quickly and easily document things on the job is becoming massively important. Sadly, the answer is no. Yes, compared to the truly dreadful cameras that used to be built into rugged mobile computing gear, the situation has improved substantially. But overall, even the best cameras in rugged gear lag way behind almost any modern smartphone. And the average cameras in rugged gear are way, way, way behind anything on a phone.

Just how critical is the situation? Very. A shocking number of mobile Windows devices simply use Microsoft's basic camera app, which has virtually no features at all and is very rarely matched to the camera hardware. That means you get blurry low-res pictures from camera hardware that, according to its specs, seems capable of significantly more. More often than not there are barely any settings at all: no white balance, no manual adjustments, no smart focusing; image aspect ratios don't match display aspect ratios; there's huge shutter lag in stills and dreadfully slow frame rates in video. As a result, even the best integrated documentation cameras barely manage passable imagery, and only after training. That is a very regrettable oversight and it's really a shame.

Then there's the hardware. Here again the iPhone, and most consumer phones, are lightyears ahead of just about anything available in vertical markets. No one expects low-volume vertical market gear to match the likes of Apple and Samsung in advanced miniaturized technology, but the gap really should be much smaller than it is. Especially given that even consumer electronics leaders don't push the envelope when it's not really needed. For example, the last three generations of iPhones have all used the same 12mp imager size, but have still been able to push camera capabilities forward with every generation. Display resolutions, likewise, haven't changed from iPhone 6 to 7 and now 8. But that's because even the iPhone 6 Plus of three years ago already had a full HD 1920 x 1080 5.5 inch screen. Rugged handhelds, on the other hand, generally offer much lower resolution.

There's no need for specialized vertical market technology to always be at the consumer technology state-of-the-art. But it must not be so far behind as to impact the usefulness of the products.

With the new iPhone X, Apple again pushes the envelope. The new phone uses an OLED screen instead of a standard LCD. OLED (organic light-emitting diode) screens don't need backlights because the individual pixels emit light. That makes for deeper blacks and more vibrant color. The screen measures 5.8 inches diagonally, but since it covers the entire front of the device, the footprint of the iPhone X is much smaller than that of the iPhone 8 Plus with its 5.5-inch screen. Resolution is 2436 x 1125 pixels, which makes for a razor-sharp 458 pixels per inch.
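Those pixel-density numbers follow directly from resolution and diagonal screen size. Here's a quick back-of-the-envelope sketch (in Python); note that advertised diagonals are rounded, so computed values are approximate:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# iPhone 8 Plus: full HD on a 5.5-inch panel
print(round(ppi(1920, 1080, 5.5)))   # → 401

# iPhone X: 2436 x 1125 on a nominally 5.8-inch panel.
# This computes to roughly 463; Apple quotes 458 ppi, the small
# difference coming from rounding in the advertised diagonal.
print(round(ppi(2436, 1125, 5.8)))
```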

The iPhone X is expensive ($999 for a 64GB version and $1,149 for a 256GB model) and will likely be a standard bearer rather than Apple's volume leader for some time to come. But its advanced technology will impact expectations and make lesser technology look old. And that can make or break sales. So by introducing their latest iPhones, Apple not only reinforced the already high expectations customers have of a modern handheld with the refined iPhone 8 and 8 Plus, but boosted expectations with the iPhone X, whose technologies will soon be considered the norm.

What does all that mean to the rugged handheld market? Most likely that things will get tougher yet. Is there still merit in building handhelds rugged from the inside out? Definitely. But waterproof consumer smartphones and the availability of some really good protective cases have raised the ruggedness ante. Is being good enough technology-wise still, well, good enough? Maybe, but probably not. Consumers are smart and their expectations increasingly dictate what makes the cut and what doesn't, even at work.

And that is the impact of the new iPhones on rugged handhelds. Everything looks a bit older yet, and expectations have been raised yet again.

So what are my recommendations to the rugged handheld industry?

The good news is that the iPhone is still your friend. It continues to open markets and applications to handheld computers where none were ever considered before. And some of those new markets and applications may go to rugged gear. So the opportunity is still there, and it's perhaps greater than ever.

To realize that opportunity, however, some things need to happen. The rugged handheld industry must:

-- Adopt contemporary technology. It doesn't have to be state-of-the-art, but it must be close enough so as not to be viewed as simply old.

-- Increase display resolution. One cannot claim to have professional tools when the professionals who use those tools expect, and need, more.

-- Pay attention to performance. A lot of the generic chips we see in use today are barely up to the job.

-- Do not disregard consumer tastes. If certain form factors, materials and sizes are appealing, use them.

-- Remember that there's one reason why customers pay more for rugged gear: ruggedness. Make sure it truly is. Have all external ruggedness testing done and available. Excel in this. If a consumer phone offers IP67, IP65 is no longer good enough.

-- Make ruggedness testing count. Explain what it means and how it benefits customers. No more generic "MIL-STD-810G compliant" claims without specifics.

-- Don't phone it in. Great cameras are part of the success of consumer phones. Make them great in rugged handhelds, too. I consider that crucial.

-- Hang on to, and improve, traditional rugged gear strengths. Outdoor viewability is one. Rain and glove capability is another. Scratch resistance, industrial-grade scanners, strong backlights, etc., are others. Make all that even better.

-- Clean up the software. An Android or Windows home screen with games, social media and entertainment doesn't impress enterprise customers. Stress functionality and security instead.

-- Drop the hype: If there are extra business or enterprise features, explain them clearly and outline their real world benefits and advantages.

So yes, the new iPhones have once again raised the bar, and thus made it that much more difficult for rugged handheld computers to measure up in many customers' eyes. But I am convinced that, if executed properly, there remains great opportunity for the rugged handheld industry. -- Conrad H. Blickenstorfer, September 2017

Posted by conradb212 at 08:44 PM | Comments (0)

July 31, 2017

A future where quality is king -- A look at Zebra's 2017 Manufacturing Vision Study

On July 31st, 2017, Zebra Technologies Corporation published the results of their 2017 Manufacturing Vision Study on emerging trends that are shaping the future of industrial manufacturing. The broad result of the global study suggests that manufacturers are adopting IIoT (Industrial Internet of Things) and Industry 4.0 concepts to get better insights and information about their manufacturing process and, most importantly, to improve quality.

Why Zebra knows

Why should such a study come from Zebra? Isn't Zebra in the mobile printer business? They are, but not only that. Zebra's been around for almost half a century, having started as "Data Specialties Incorporated" back in 1969. They initially made a variety of electromechanical products but soon focused on labeling and ticketing systems. The name changed to Zebra in 1986. A dozen or so strategic acquisitions followed and by 2013 Zebra cracked a billion in sales. And then more than tripled that in one fell swoop by acquiring Motorola Solutions' enterprise business in 2014.

Why do I mention all that? Because Motorola Solutions' enterprise business mostly consisted of the former Symbol Technologies, a pioneer in retail and inventory management bar code scanning systems. In my capacity as Editor-in-Chief of Pen Computing Magazine I visited Symbol's headquarters on Long Island several times before its acquisition by Motorola in 2007. On each visit I was impressed with not only Symbol's scanning and mobile computing technology, but also by the importance they placed on exploring fresh ideas and innovative concepts to push the state-of-the-art in data collection to new levels and in new directions. And by how in tune they were with emerging trends and directions.

Despite Symbol going through rough times in the early 2000s, that innovative spirit never flagged, and that didn't change under Motorola nor under yet another big change when Zebra came along, and it's apparently still there. I don't know how many of the same people are still around, but that spirit of innovation, of dreaming up new concepts and uses, of trying new things, of viewing great hardware as just one part of making business better, that's still there. And it's at the heart of those vision studies that Zebra's been generating.

Vision studies

These vision studies are, truth be told, not super-easy to read and comprehend. On the surface, the studies are surveys of how customers use technology, how they view new developments, and what their plans are for the future.

But Zebra also provides considerable value-added by presenting the survey results with commentary and in the context of emerging trends and directions. That includes not only the obvious, the Internet of Things, but market-specifics, like the Industrial Internet of Things (IIoT), and Industry 4.0 where intelligent networked systems go beyond mere machine control via feedback systems, interdisciplinary cooperation, IoT technology, and advanced resource/demand management to morph into "intelligent" or at least "smart" factories.

Zebra doesn't even stop there. The value-added of the vision studies also includes relating their survey findings to emerging trends, mindsets, and revelations, and how technology can get customers from here to there.

In this latest vision study entitled "Quality Drives a Smarter Plant Floor: 2017 Manufacturing Vision Study," Zebra highlights a major shift and transformation, that from cost-cutting no matter what to quality first.

Exclusive focus on ROI is out. Focus on quality is in

In essence, what's happening is that almost everyone is realizing that cost-cutting in an effort to boost short-term return on investment has been a disaster. While it's certainly a good idea to eliminate waste, far too often cost-cutting has led to loss of quality. The whole concept of quality is extremely multifaceted, but cutting costs just to please investors is short-sighted: it almost inevitably leads to lowered quality, and lowered quality frustrates and angers customers, no matter how loudly they asked for low prices.

The result of realizing the dangers of exclusive focus on cost cutting to improve ROI is that quality is king again. But wait, don't you inevitably get what you pay for? If manufacturers used to cut costs to remain profitable even as quality suffered, won't a new emphasis on quality increase costs and raise prices?

The price of quality would indeed be higher prices if manufacturers continued business as usual. But Zebra's survey of 1,100 executives from automotive, high tech, food, beverage, tobacco and pharmaceutical companies showed considerable optimism about the positive impact of technology on both quality and revenue. The number of fully-connected smart factories will double in the next five years, the use of emerging technologies will rise, and respondents felt very positive about the impact of technology and automation. And Zebra cites a study by the American Society for Quality (ASQ) that claims that for every dollar spent on a quality monitoring system, companies could expect to see an additional $6 in revenue, a $16 reduction in costs and a $3 increase in profits.

How to improve quality and ROI

But how, exactly, do they expect to get from here to there? From the dead-end of the cutting costs/lowered quality cycle to solid quality and growth?

In essence by using emerging technology, IIoT and Industry 4.0 concepts to keep better track of both the supply chain and of assembly lines.

As things stand, Zebra points out, there's no real-time communication between supply chain and production lines, resulting in too little inventory (slowing production) or too much of it (increasing costs).

And while 27% of those surveyed by Zebra are collecting data from production lines, supply chains and workers, that data sits in some database or spreadsheet instead of being shared, leaving the true value of the intelligence untapped. In addition, tracking points are few and far between.

While most manufacturers already track production, real-time monitoring is usually limited to just a few checkpoints, or so-called "gates." Visibility over the entire supply and production process can be vastly improved by adding more connected gates that use auto ID technology to provide real-time location, material allocation and asset condition at every critical juncture. This data can then be used to eliminate bottlenecks, communicate with suppliers, optimize just-in-time shipment, and ensure quality. And additional gates can help meet the ever more important demand for increased product variants.
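As a rough illustration of the gate concept (purely hypothetical code, not any actual Zebra API or product), each auto-ID read at a gate can update a shared, real-time picture of where every asset is and what condition it's in:

```python
from datetime import datetime, timezone

# In-memory store of the last known state of each tracked asset.
# Purely illustrative: a real deployment would feed a shared database
# from barcode/RFID readers rather than a local dict.
asset_state = {}

def gate_scan(asset_id: str, gate: str, condition: str = "ok") -> None:
    """Record an auto-ID read at a gate: which asset, where, when, and condition."""
    asset_state[asset_id] = {
        "gate": gate,
        "condition": condition,
        "seen_at": datetime.now(timezone.utc).isoformat(),
    }

def assets_at(gate: str) -> list:
    """Real-time view: every asset whose most recent read was at the given gate."""
    return [aid for aid, state in asset_state.items() if state["gate"] == gate]

gate_scan("PART-0001", "receiving")
gate_scan("PART-0002", "receiving")
gate_scan("PART-0001", "assembly-line-3")   # asset moves to the next gate

print(assets_at("receiving"))        # → ['PART-0002']
print(assets_at("assembly-line-3"))  # → ['PART-0001']
```

The same event stream could then drive bottleneck detection (assets lingering too long at one gate) and automatic supplier notifications, which is essentially the "Sense, Analyze and Act" loop described below.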

Towards quality and visibility

What are the components needed to make all of this happen? There's "the Cloud," of course, with its various levels and services, ranging from simple sensors and gateways all the way up to the various enterprise IoT offerings with all of their services and capabilities.

What Zebra contributes here is the subset of IoT that the company calls "Enterprise Asset Intelligence." That includes the company's impressive roster of wearable and mobile technology, various methods of data capture and identification, voice recognition, smart IDs, as well as enabling hardware/software solutions such as Zebra's SmartLens for Retail or a variety of purpose-built manufacturing solutions that cover every aspect of the operation.

Tracking assets from start to end is key to optimal, consistent quality. That makes real-time location systems (RTLS) in the manufacturing environment an interesting and mandatory proposition. Collected data along every step of the production process can be used not only for quality control purposes. It can also be used to communicate directly from the factory floor to suppliers so they can quickly react, keeping supply and inventory at optimal levels. Zebra calls that "Sense, Analyze and Act."

Moving past strategy and into deployment

What it all amounts to is that with its 2017 Manufacturing Vision Study, Zebra provides not only information on how over a thousand manufacturing industry executives view the current and future use of data capture technology in their operations. Zebra also offers a blueprint for moving from a strictly ROI model to a future where simple cost-cutting is replaced by greatly enhanced visibility over both supply chains and production lines via a finer mesh of intelligent gates, optimizing both quality and cost in the process. Zebra feels that it is ahead of the curve in this, and one of the few players that has moved past strategy into real world deployments that will have profound effects on all their customers in the coming years.


Zebra's 2017 Manufacturing Vision Study

Zebra Wearables on RuggedPCReview: WT6000
Zebra Wearables on RuggedPCReview: WT41N0
Zebra Touch Computers on RuggedPCReview: TC5
Zebra Touch Computers on RuggedPCReview: TC70x
Zebra Touch Computers on RuggedPCReview: TC8000

Posted by conradb212 at 05:11 PM | Comments (0)

July 27, 2017

GammaTech celebrates its 30th anniversary

GammaTech Computer Corporation is celebrating its 30th anniversary this month, July 2017. That's amazing longevity in an industry where big names come and go. And it marks GammaTech as one of the pioneers in an industry and technology that truly changed the world.

It's hard to believe that it's been 40 years since the Apple II rang in the era of personal computers, seen first just as expensive toys for nerds and hobbyists, but then, four years later, legitimized by the IBM PC.

The 1980s were the Wild West era of personal computers. PC trade shows drew huge crowds. Trade magazines had many hundreds of pages every issue. Everyone wanted in on the action.

Taiwan early on established itself as a major player in the OEM industry, OEM standing for Original Equipment Manufacturer, companies that actually make the products sold by another company under a different name. One such company in Taiwan was Twinhead International Corporation, established in 1984. Initially a domestic maker of PCs and peripherals, Twinhead soon branched into global markets, setting up subsidiaries in the US, Germany, France, and the UK, as well as distributors in dozens of other countries.

Twinhead USA, now known as GammaTech, was the first such international branch and became instrumental in distributing a succession of interesting, innovative Twinhead Slimnote, Superlap, Supernote and Efio! brand laptops.

Times, however, changed. PCs became a commodity in a market increasingly dominated by a few large companies. Even IBM dropped out of the PC business, and other major players began concentrating on niche markets. Just as Panasonic launched the Toughbook and built that brand, Twinhead turned its focus to industrial and application-specific and mission-critical systems and launched the Durabook brand of rugged notebooks and tablets.

And when did the Durabook brand get started? It happened a little bit like with Panasonic whose rugged notebooks existed before the Toughbook name was introduced. Twinhead, likewise, launched their first military-grade rugged notebook, the N1400, around the turn of the millennium, but the Durabook brand itself began appearing in 2002.

In our 2003 Comdex coverage, we reported that "Twinhead announced their latest series of semi-rugged notebooks that may prove to be tough competition for Panasonic." We had ample opportunity to examine Twinhead quality and ingenuity firsthand in products the company created for and with longtime partner and customer Itronix. An example was the Itronix GoBook Duo-Touch where our reaction was "Wow. They really nailed it this time." And the Itronix GoBook II — which started life at Twinhead's manufacturing plants before they were sent for final customer configuration at Itronix in Washington — became Pen Computing Magazine's Editor's Choice in the "high performance rugged notebook" category.

GammaTech displayed a show of strength again a couple of years later in the 2005 Pen Computing Magazine's Editor's Choice Awards where Durabooks battled Panasonic Toughbooks to a draw with two awards each.

That same year, RuggedPCReview reviewed the Durabook N14RA. In the review we mentioned Twinhead's initial Durabook "Slim, mobile, ruggedized, and affordable" slogan, and that the N14RA certainly fit that bill. We liked not only the design and toughness of the N14RA laptop, but also praised Twinhead for a "masterful job selecting materials and textures". We also explained how the company had very productive relationships with major vertical market leaders such as Itronix, and thus has gained substantial expertise in the design and manufacturing of durable, ruggedized, and fully rugged mobile computing equipment.

Over the years, we have examined and reported on GammaTech Durabooks dozens of times. Around 2010-2012 we found Twinhead's N-Series of magnesium-bodied notebook computers "very solid and trust-inspiring." We were impressed with the Durabook U12C and R13S convertible notebooks that provided the same functionality as the market leaders, but at a significantly lower price.

More recently, we applauded GammaTech making available the Durabook R8300, an updated version of the Twinhead-sourced General Dynamics Itronix GD8200. This was a terrific fully-rugged laptop that appeared to vanish when General Dynamics closed down Itronix. But GammaTech brought it back better than ever.

We did detailed reviews of GammaTech's various rugged tablets and were particularly impressed with the still available Durabook R11: "to say that the GammaTech Durabook R11 is impressive would be an understatement." GammaTech then followed up with an updated R11 tablet in conjunction with a well-conceived keyboard that made for a very useful hybrid 2-in-1 ("A well conceived, well executed solution without any of the flimsiness and stability issues of most add-on keyboards").



Likewise, we were impressed with the updated Durabook SA14 semi-rugged laptop and called it "a good deal for anyone who needs a high-performance, highly configurable notebook that is significantly tougher than standard consumer laptops and should hold up well in daily use".


Overall, GammaTech could serve as a case study both of longevity based on exemplary quality and dedication, and of the challenges of splitting business between OEM/ODM and one's own brands.

Given the company's experience in rugged markets and the remarkable quality and price points of its products, one would expect to find GammaTech among the rugged computing market leaders. But those spots are taken by Panasonic, with Getac and Dell battling for second place. The closing down of General Dynamics Itronix probably was a big blow for Twinhead. Another issue might have been product overlap as a result of OEM customers requesting similar but not identical models that could cause confusion when appearing under the Durabook name.

Now, celebrating its 30th birthday, GammaTech sports a trim, focused product lineup that includes the SA14 and S15AB semi-rugged laptops, the fully rugged R8300 laptop, the rugged R11 tablet that can also be used as a 2-in-1, the military-style 10-inch V10BB fixed-mount tablet/panel, and the multi-purpose 24-inch Durabook P24 workstation. Between the company's long established experience and track record, the Durabook brand equity, and a fine-tuned product roster that's not only technologically up-to-date but also attractively priced, the pieces are in place for GammaTech to set its sights on gaining marketshare.

So congratulations, GammaTech, on your 30th anniversary, and for contributing so much to making computers durable, tough and rugged enough to hold up out there on the job and in the field. And for keeping them affordable. Best of luck for the future. You have much going for yourselves.

Posted by conradb212 at 11:09 PM | Comments (0)

June 16, 2017

Apple Watch Series 2 after three weeks

It's been three weeks since I finally gave in and bought a Series 2 Apple Watch. While the Apple Watch isn't a rugged device and thus not something we'd normally report on, it is a wearable device and wearable computing power is playing an increasingly prominent role. That's because unlike even handhelds, a wrist-mount is always there, always handy and it doesn't need to be stowed away.

So what's been my experience over the first three weeks with the watch? The good news: I am still wearing it. That could be because I wore wrist watches for decades (albeit not in a good many years) and am therefore used to the concept of wearing something on my wrist. It could also be that I found enough to like on the Apple Watch to keep wearing it. Most likely it's a combination of the two.

So here are some of my impressions, starting with the good:

Battery life — Battery life is better than I expected. With the way I've been using the watch I get at least two days of wearing it (including nights) and sometimes three. That's much less than a regular watch whose battery lasts years, but the Apple Watch likely has far more computing power than Apollo 13 had onboard when it went to the moon. And the Apple Watch charges quickly. An hour and a half and it's fully recharged.

Great watch — It makes a great watch. The Apple Watch can display an endless variety of watch faces. I use a simple, elegant white-on-black analog one. The display is bright enough even in full sunshine. I configured the watch face so that it also displays the date/day of the week, remaining battery charge, weather, and the messages icon.

Messaging, weather and phone — I am surprised by how much I like having the weather info on the watch. I like knowing what the temperature is outside and whether it's sunny, cloudy or raining. I also like having my messaging app right on my wrist. This way I see incoming texts instantly, I can scroll through texts, and I can answer with quick canned responses or even brief messages (there's a letter recognizer function).

Knowing not only when I get a call or a text, but also being able to answer it right on the watch is great. Yes, that's what the iPhone is for, but given how large my iPhone 7 Plus is, I don't always have it on me. As long as my iPhone is within about 100 feet of the watch, there's a Bluetooth connection, and the watch has WiFi, too (albeit only the 2.4GHz band), so as long as my iPhone is on a network, the watch can be, too.

Using the watch as a phone works remarkably well. The sound is amazingly good. But whenever I use the watch as a phone I wonder why there isn't also a camera built into the watch.

Heart rate sensor — The Apple Watch's heart rate sensor is quite informative. For many years I thought my resting heart rate was around 60, and I really didn't know what it was when I was running. Thanks to a Sleep app on the Apple Watch I now know that my resting heart rate is more like 50. And thanks to my running app I now know my heart rate when I am exercising.

The sensors in the Apple Watch tell the watch how and when I am moving. That way the watch knows when to turn its display on for me to tell the time, and when I am laying down or sleeping. The Sleep app uses that information to show me whether and when I am restless during my sleep.

Onboard GPS — As stated, I bought the Apple Watch primarily because I didn't want to hold my iPhone in my hand or have it in a pocket when I go running. The original Apple Watch didn't have its own GPS and so relied on the iPhone for collecting positioning data. The Series 2 watch does have GPS and can record geographic information by itself. And since my preferred running app does have an Apple Watch app, I was looking forward to leaving the phone at home when running.

Initially, that didn't work. And I still haven't been able to get my favorite running app to reliably record runs. So I tried a different one (Nike's NRC+) and that one works just fine. The app records distance, speed, heart rate and route. Once a run is done, it passes the run info on to the iPhone. So that's what I have been using on my runs. However, I didn't like switching from my favorite running app one bit. Just as I wouldn't like having to switch from my favorite word processor to another one.

So that's what I have primarily been using the Apple Watch for. If that doesn't sound like much, a) I'll doubtlessly discover other cool things, and b) it's no different on my computer where 98% of the time I use the same four or five software applications.


But not everything has been great:

Battery life — Battery life, while better than expected, is still a limiting factor. With the Apple Watch, I now have yet another piece of electronics that I have to remember to charge. Which means another charger and another cable, and something else to lose or forget or damage.

No truly compelling apps — I am still looking for the "killer app" that would truly justify the Apple Watch and make it indispensable. In the past most people absolutely needed a watch to tell time. Now we use smartphones for that. It has to be something that the watch does inherently better than the phone, and I haven't found anything like that.

A couple of years ago, Apple said there were already 8,500 Apple Watch apps. Today that number is likely much higher. Unfortunately, the majority of watch apps I tried are really quite useless. Not all of them; some address specific needs, like tidbits of information that are handy to have without reaching for the phone. But for the most part, whatever the watch can provide on its tiny screen, the iPhone does much better on its larger screen.

Clumsy interface — It doesn't help that the Apple Watch interface and controls are marginal. The mosaic of tiny round icons without any labeling is initially bewildering. I actually got used to it better than I expected, but it's definitely not a perfect solution, especially when one has a large number of apps on the watch. Apple itself seems to realize that, as the next revision of watchOS will offer a simple alphabetical list of apps as an alternative.

As far as hardware controls go, the crown and push button are far from self-explanatory. Use the push button to bring up the dock, then rotate the crown up and down to go left and right on the dock? Not great. Apps may require a tap, a double-tap or a hard press to do things, but you never know which.

Disturbing Watch issue — Then there are other issues. After my wife recently returned from her vacation she found, much to her dismay, that the display of her own original Apple Watch had popped open! The battery inside was visibly swollen. Not good at all. Googling this disturbing situation revealed that bloating first-gen Apple Watch batteries are well documented, though 9to5Mac.com stated that the problem "appears to not be widespread or something that has made mainstream media headlines."

Apparently the issue is serious enough for Apple to extend the service coverage for swollen first gen Apple Watch batteries to three years. We made an appointment with the local Apple store where a rather disinterested and not very helpful "genius" said he'd only ever heard of one other such problem. Apple would replace my wife's watch. Given that her faith in the watch is shaken, my wife, who was visibly appalled at the poor Apple Store experience, asked if she could pay for an upgrade to a Series 2 watch. No can do, said the genius.

So the verdict on the Apple Watch remains inconclusive. Stay tuned.

Posted by conradb212 at 08:09 PM | Comments (0)

May 30, 2017

Initial impressions of an Apple Watch holdout

So I finally got an Apple Watch. Series 2, the one that's waterproof and has onboard GPS. I chose the larger 42mm model; when you need reading glasses, every millimeter counts. That meant a $400 investment for a bottom-of-the-line watch with the aluminum case (as compared to the much more expensive stainless steel and ceramic ones). I picked space-gray with a black Nike sport band.

What took me so long? I am not sure. As a native Swiss I love watches and there were times in my life where I collected them. And I love Apple. When Apple has something new, I almost always buy it right away. So why not the Apple Watch when it first came out?

Overall, a funny thing happened to watches. When I grew up and for most of my earlier life, everyone wore a watch. A watch was part of life. But that has changed. I see comparatively few people wearing watches today. And if they do, the watch seems more a fashion accessory or a status symbol than something that tells time, which is really all that watches do. Fact is, smartphones have replaced watches. And cars and almost any other gizmo. They all tell time. Sure, it takes a bit longer to reach for the smartphone than to simply glance at your wrist to tell time, but we have gotten used to it.

That still doesn't explain why I didn't get an Apple Watch when it first came out. Between loving watches, loving Apple and loving technology, I should have but I didn't. And it's not that I have some inherent issues with watches that tell more than time. I bought the Microsoft watch almost a quarter of a century ago. I had numerous Casio and Timex watches that did all sorts of things.

So why not the Apple Watch? I am not quite sure. I recall being underwhelmed after watching Apple's initial introduction of the watch. It looked sort of blah instead of new and supercool. It couldn't really do anything that I felt I really needed. No Dick Tracy functionality there. And not even the promise of video calls that I'd seen in a Panasonic concept 15 years ago (see image to the right). Worst of all, the Apple Watch needed the iPhone to do much of anything. Why get something that makes you carry something else around? So I passed.

My wife finally got a Series 1 Apple Watch about a year after its April 2015 introduction. She was excited when she found that she could use the Apple Watch as a phone around the house. That meant there was no need to always have the iPhone with her, or trying to remember where it was. But using the Apple Watch as a phone quickly drains the battery. Apple claims up to 3 hours talk time. My wife is on the phone quite a bit, and the Apple Watch battery often didn't even make it past noon. She concluded that apart from being a conversation piece, the Apple Watch really couldn't do anything for her and so she stopped wearing it.

Then Apple introduced the Series 2 of the watch. Now the Apple Watch was waterproof, twice as bright, and had onboard GPS. That caught my attention. Waterproof to 50 meters (164 feet) and a brighter screen made it suitable for outdoor adventures. And onboard GPS meant you wouldn't have to take your iPhone along when jogging or running.

Waterproof to 50 meters ****

My interest cooled significantly when I found that Apple's idea of waterproof to 50 meters is very different from mine. To me it meant theoretically being able to take the watch on scuba, even past the recreational limit of 40 meters. To Apple it means the watch "may be used for shallow-water activities like swimming in a pool or ocean. It is also safe to wear it while showering or in a hot tub. However, Apple Watch Series 2 should not be used for scuba diving, waterskiing, or other activities involving high-velocity water or submersion below shallow depth." Truth be told, just like I never stuck my expensive iPhone in a waterproof case to take pictures and video while scuba diving, I wouldn't take a $400 Apple Watch underwater. But Apple's bombastic claim just rubbed me the wrong way.

So that left the onboard GPS. And that was what, in a spur-of-the-moment decision, made me finally get an Apple Watch. Getting a jogging GPS watch had been brewing in my mind for a while because I am a runner. Doing a good, exhausting run three times a week works for me and I've been doing it for many years. On top, I frequently run 5Ks. I keep track of my runs and performance with a dedicated running app on my iPhone. And have been holding the big iPhone 7 Plus in my right hand on many hundreds of runs. Not optimal.

Awesome packaging

Getting any Apple product is exciting. Apple knows how to package and present. The black box the Apple Watch comes in is over a foot long and, empty, weighs a pound. That's a bit more than an iPad Air 2. Just for the box. And the Apple Watch is really quite attractive in a minimalistic way, sort of like a shrunken iPhone. The semi-matte anodized space-gray finish looked terrific. Instructions are minimal, too, as they are with all Apple products. After all, Apple products are so simple to use that instructions aren't needed.

Except that with the Apple Watch they are. Needed, that is.

That's because trying to figure out the relationship between the iPhone and the Apple Watch is more like working with iTunes (widely known as Apple's most cryptic, befuddling and recalcitrant piece of software).

This Apple product does need instructions

Apple's been working to portray the watch as more than just an adjunct to the iPhone. Onboard GPS and more apps that don't need the iPhone to run are a good start, but the fact is that the Apple Watch remains dependent on the mothership iPhone for almost everything that goes beyond just being a watch.

That means becoming very familiar with the iPhone's Watch app that's used to set and configure and manage the watch. In essence, the iPhone acts like a server to the Apple Watch, or like the Cloud to a Chromebook. The watch can do some things, but it doesn't have its own internet access, and what can one do these days without internet access?

Unfortunately, very little is obvious on the Apple Watch and in the watch-related software on the iPhone. It does start innocuously enough, with browsing various watch faces which, on top of basic watch functionality, can accommodate up to four additional functions, like showing battery charge, date, weather or icons to bring up favorite apps.

Complications???

What comes next is more complex and getting it right will make or break how useful the Apple Watch becomes.

It starts with "Complications". In Apple Watch lingo "complications" doesn't mean difficulties or being confronted with unexpected problems. Instead, the word refers to additional functions that can be shown on the watch face. Apparently, Apple chose the word because old-style Swiss watchmakers used it to describe anything that made the basic clockwork of one of their masterpieces more complicated. Using it on the Apple Watch is, well, absurd.

Next comes figuring out which apps that aren't "complications" should also run on your Apple Watch. As is, some but not all of the apps resident on your iPhone have Apple Watch apps, and those all show up as installable. If you are not super-careful, they will all be installed, which means you'll end up with hundreds of tiny icons on your Apple Watch. You won't know beforehand what those apps will do or what they look like. Tapping an app in the iPhone's Watch app installs it, with no explanation. Tapping again uninstalls it, but this seems a very inadequate method of figuring out what should occupy the Apple Watch's limited memory and even more limited display real estate.

But that's not all. For each of the apps you should then set what sort of notifications you want to receive on the Apple Watch. That's similar to the iPhone, where it's all too easy to be peppered with endless notifications all day long. Now THAT's a complication.

The iPhone Watch app also lets you play with the layout of all the icons that represent apps installed on the watch. This actually works a lot better than arranging icons on the iPhone itself. You can arrange the little watch icons any which way easily and we've seen some innovative layouts.

Finally, the Apple Watch also has a software dock. The dock serves the same purpose as the icon dock on a Mac or the fixed apps dock on the bottom of an iOS device: those are apps you want to access quickly without having to first locate them on some other screen or place.

As far as user interface goes, swiping left and right brings up different watch faces that can include different "complications." That can come in handy if you use the watch for different purposes. For example, you could have a workout watch layout that shows heart rate and workout data, one that's for travel with time zones and airline app "complications," or one that's pretty and shows pictures from a special folder.

Swiping up shows the Control Center where you can set airplane mode and other modes (silent, do not disturb, theater, ejecting water from the speaker after swimming, and more). Swiping down shows notifications that have come in. Some apps support "force touch," which brings up app-specific controls if you press hard. Pushing the crown toggles between the watch face and the app icons, and pressing and holding brings up Siri. Turning the crown scrolls. A side button brings up the app dock. There's no multi-touch, but with a bit of practice the interface is fairly easy to remember.

What is the watch doing? What does it need? How? When? Where?

Where it gets difficult and often quite frustrating is what the watch doesn't tell you. Simply put, the Apple Watch does things, but it almost never tells you what and how. With no pop-up windows to tell you what tapping a function does, and with no room for explanatory text on the small Apple Watch screen, using the watch is often trial and error. It's not obvious whether there's a connection with the mothership iPhone, whether there's GPS reception, what apps may be sucking power in the background, how to close down such apps, whether or not the heart rate sensor is on, or what its sampling rate is. And so on and so on.

The Apple Watch OS is said to be much simpler and more intuitive than it was in its original version. Problem is that simplicity is a double-edged sword. If it's done well, a system simply works and you don't have to worry about a thing. But if necessary information is withheld for simplicity's sake and you never know what the system is doing, it can become quite frustrating. And that is, to me, the Apple Watch's biggest shortcoming. Or one of them anyway, for there are a few others.

So far, the good:

I am still in the early phase of getting acquainted with the Apple Watch. I've had it less than a week. So let's start with what I like:

The Apple Watch is a very cool watch. The 302 ppi 1,000 nits OLED display is definitely "retina," very clear, very sharp, beautiful. The display is off when I am not looking at it, but comes on with just the slightest delay when I flick my wrist up to look at the watch. My favorite watch face is a very simple analog one. The four "complications" (it's difficult to refer to this and not use quotes...) I use are remaining battery, day and date, weather summary, and the message icon.

The battery life is better than I expected after my wife's experience with her watch. I know now that some apps eat up battery just as they do on an iPhone. Stay away from those or use them sparingly, and the Apple Watch can run two or three days between charges. And it charges quickly (less than two hours from empty to full).

Though there isn't anything just yet that I'd consider a "killer app," some apps do come in handy. Since the watch is always on me, I have time, weather, and some important notifications on me, always; I don't have to first look for the phone. When wondering what to wear or expect, a glance at the watch shows me the temperature and likely weather. I see messages right away when they come in, I can read them, respond, and even see pictures, albeit without being able to zoom in to see more detail.

I can see how the various workout and health apps and functions Apple built into the watch can come in handy. Likewise, several other apps have potential, though it will take time and some effort to figure out which ones and how they are best used.

And the not so good:

Unfortunately, there's quite a bit that I don't like or that leaves me underwhelmed.

The overall impression I have for now is that the watch just doesn't know (yet) what it is and/or wants to be. In that it reminds me of the initial iPhone which did not have apps as we know them now, just some minimal onboard functionality and the ability to look at websites that had special iPhone screens. That quickly changed and today the iPhone is a very powerful multi-purpose computer that no longer needs its own mothership (iTunes on a Mac or PC) to function.

After almost two years on the market, it seems that the Apple Watch has not reached that stage yet. Yes, there are apps that run on the watch, but their functionality is minimal and it's NEVER clear what connection to the iPhone app they should or must have to do what. I haven't seen anything on the watch that comes close to being a killer app. The watch remains an adjunct to the iPhone. That limits the watch to being of potential interest only to a relatively small fraction of iPhone users who feel they also need an Apple Watch.

You could well argue that the iPhone itself really doesn't have a killer app that absolutely everyone must have and that has changed life as we know it. It's the overall usefulness of the iPhone that is its killer app. The iPhone has long since stopped being a "phone." The phone aspect is just a legacy leftover that telcos can still milk for a service fewer and fewer rely on: making old-style phone calls. But the iPhone itself, like the Android smartphones it inspired, is a new and incredible tool that we can no longer do without. The Apple Watch, on the other hand, uses a legacy form factor, the wrist watch, to somehow extend the iPhone, which really does not need extending.

But what about the health apps?

But what about the health and exercise aspect that Apple is pushing? That's certainly laudable. The Apple Watch comes with apps that are closely synchronized with the main Health app on the iPhone. And since the watch can record heart rate, which the iPhone can't, and since it's always on the body and can record movement much better, there's definitely more information with which to monitor and manage health-related functions.

But for that to really work, you have to wear the watch all the time, make sure everything is set up just right, have the iPhone handy, and then figure out what it all means, as the health app is quite a voluminous beast.

Health is certainly good, and if the Apple Watch makes more people aware of their health and what they can do to lead a healthier lifestyle, then that is a noble and worthwhile thing for sure. But it's definitely not a killer app: it is of potential interest only to the fraction of the population that has an iPhone; of those, the fairly small subsection that also has an Apple Watch; and of that contingent, only the relatively small number of watch owners who are into using all the health functions, and use them religiously (because otherwise they are meaningless).

Why I got the Apple Watch

Just to reiterate, even though I am an Apple fan, a gadget lover, a watch lover, and I report on technology for a living, it took me almost two years to get an Apple Watch. And the reason that made me finally do it is because I thought it'd be great to no longer need to take the big iPhone 7 Plus with me when I go running, and to also record heart rate data. That is what made me buy the Apple Watch. For everything else, I was just fine with just the iPhone.

So what has been my experience with the Apple Watch to record my running? I don't know yet. Because I have not been able to get my running app, the one I essentially bought the Apple Watch for, to record my runs. That's a major disappointment and frustration. I know that the innovative developer of the exercise software I use spent much time in creating an Apple Watch app to go with their iPhone app (unfortunately that also made the iPhone app progressively more complex and much slower).

Yet, I can't get the running app to work on my new Apple Watch. It will search for a GPS signal, but when the message finally goes away, I have no way of knowing if GPS is now locked in or not, since the watch doesn't have an indicator for that. When I push the Start button on the app, nothing obvious happens, and when I wake up the watch upon finishing the run, the app says Start again, having recorded nothing. Or perhaps it did, for a while.

Although I like my running app very much and it's probably among the most used apps on my iPhone, I googled for alternatives and read their reviews. That was sobering. Most reviews are unclear on whether or not the watch apps run independently on the Apple Watch, or if they need the iPhone. Which makes a huge difference. I downloaded a few to see how they might work for me. Unfortunately, many apps these days force you to set up an account with them before you ever get to launch the app. All I want is to collect data for myself and store it on my device, folks!

Inconclusive

So I remain undecided on the Apple Watch. It has, so far, certainly not replaced my iPhone as my running companion, and at this point I am not sure it ever will. I have not found anything on the watch that I totally need, or even that seemed of compelling use. As a watch, it's quite nice and it is a conversation piece. An expensive one, of course, but then again, millions of people spend far more on watches that can do far less. -- Conrad H. Blickenstorfer, conradb212@gmail.com

Posted by conradb212 at 09:14 PM | Comments (0)

May 23, 2017

History repeats itself: it's now the Surface Laptop

So the long awaited Microsoft Surface Pro 5 has finally been unveiled as the "new Surface Pro." In its media release, Microsoft calls it "the next generation of the iconic product line and the most versatile laptop on the planet. The new Surface Pro delivers the most performance and battery life in a laptop that is this thin, light and quiet."

So right off the bat, Microsoft makes it clear that it now considers the Surface Pro a laptop and no longer a tablet. And just for good measure, within the first two paragraphs of the May 23 press release, Microsoft doubles up by calling the new Surface Pro "the Surface Laptop," "the ultimate laptop," "making the classic laptop feel fresh again" and talking about "the most versatile laptop" and a "powerhouse laptop."

The message is super-abundantly clear. Laptop. Not tablet. Laptop. With a bit of sarcasm one might ask what took Microsoft so long this time to abandon the tablet. After all, mobile computing historians recall how quickly Microsoft abandoned its early tablet initiatives over a quarter of a century ago as soon as the pesky PenPoint platform had been defeated, and how Microsoft's own 2001 Tablet PC morphed into convertible notebooks before the Tablet PC was even officially launched.

What does it all mean? To put it bluntly: for the third time Microsoft has realized that Windows simply isn't a tablet operating system and, given the role Windows plays on many hundreds of millions of desktops, it most likely never will be. Let's face it: with their myriad functions, features and tiny controls, the Windows power apps are not tablet material. As long as the Windows legacy lives, and that may be for quite some time, using touch and pens for an OS and software culture that was designed, from the very start, around the mouse will never work.

Eventually that may change, but that won't be anytime soon. It's been five full years since Windows 8 was launched, and with all of Microsoft's might, the vast majority of the scarce touch-centric apps remain basic and fairly useless. Even under the latest version of Windows 10, touch is just a thin veneer on top of Windows as it's always been.

But what of the remarkable success of the Surface tablets? Microsoft hasn't been officially bragging about that, as making its own tablet hardware put it in direct head-on competition with its hardware partners, whose business fortunes depend largely on selling hardware for Microsoft's software.

It's easy to see why Microsoft was tempted enough to make its own hardware to actually go ahead with it. A good part of Microsoft's Windows problems is that the software is expected to run on whatever PC hardware it's installed on. That means literally billions of possible permutations of hardware components. Compared with Apple, which makes its own hardware for its own software and has total control over it, that's a daunting challenge.

So why was the Surface so remarkably successful? Because Microsoft, for once, controlled both hardware and software, and took advantage of that to make really good tablets. And also because of the remarkable success of the tablet form factor in general.

But Microsoft is not stupid and has most likely been painfully aware that there's just no way that, given the gigantic Windows legacy behemoth hanging around its neck, Windows will ever work very well on tablets. As long as users stay in the "tablet mode" of Windows 10, things work marginally well, but those apps are really only sort of an Android Light. So even Surface users will likely spend much of their time in desktop mode with the keyboard and a mouse attached.

So why not go the "it's not a bug, it's a feature" route: rather than admitting that, as is, Windows isn't a good match for tablets, simply make the tablet more like a real Windows machine, like a laptop. Like the ultimate, most versatile powerhouse laptop. The Surface Laptop. One that works like a real old-style laptop with a decent keyboard, but one that, thanks to advancing technology and miniaturization, can also be used as a tablet if the user comes across some truly touch-centric tablet software.

So there.

All that said, the specs, of course, look great. Kaby Lake all the way (conveniently precludes Windows 7), nice big 12.3-inch 2736 x 1824 pixel screen (really the same as the Surface Pro 4), impressive projected battery life, but rather costly ($2,199 with 16GB RAM and a 512GB SSD).

Just one thing: though the Surface is now a laptop, it doesn't come standard with a keyboard or a pen. Those are extras. Is a laptop without a keyboard still a laptop?

Overall, it's a bit odd.

Posted by conradb212 at 07:56 PM | Comments (0)

March 09, 2017

Are "mobile" sites really needed?

A few days ago I used one of the readily available website analysis tools to check RuggedPCReview.com. The resulting report gave me a stern "site not mobile-optimized" lecture.

"Mobile-optimized," of course, refers to the fact that sites on the one-size-fits-all world wide web are being viewed on a very wide range of devices with a very wide range of screen sizes. And it is certainly true that viewing a webpage on a 27-inch display is a very different experience from viewing it on a 4.7-inch smartphone.

That predicament was recognized early on, years and years ago, and for a while there were efforts to have alternate web experiences for small devices. Examples back in the 1990s were the NTT DoCoMo system in Japan, WAP (Wireless Application Protocol) as sort of a barebones browser, and numerous proprietary browser-type apps, none of which ever had a serious impact.

Most recently, "mobile-optimized" has come to mean web pages that automatically rearrange themselves when the default width of the page is wider (i.e., has more pixels) than the display it is viewed on. That's generally done with a sort of block system where blocks sit side by side if there is enough room, or stack themselves when there isn't. Or the blocks shrink in width, making everything taller and narrower. Which is just about the very opposite of good page layout. It's very difficult to make a site both look good and read easily when layout is done by shrinking, stacking and re-stacking blocks.
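The block behavior described above can be sketched as a toy layout pass. This is a hypothetical illustration of the general idea, not how any particular browser or CSS framework actually implements reflow; the function name and pixel widths are made up for the example.

```python
# Toy sketch of "responsive block" reflow: blocks share a row while
# their combined width fits the viewport, and stack once it doesn't.
def reflow(block_widths, viewport_width):
    """Group block widths into rows that each fit the viewport."""
    rows, current, used = [], [], 0
    for w in block_widths:
        if current and used + w > viewport_width:
            rows.append(current)      # row is full, start a new one
            current, used = [], 0
        current.append(w)
        used += w
    if current:
        rows.append(current)
    return rows

# Three 400px blocks side by side on a wide desktop screen...
print(reflow([400, 400, 400], 1280))  # [[400, 400, 400]]
# ...but stacked into three rows on a 480px-wide phone screen.
print(reflow([400, 400, 400], 480))   # [[400], [400], [400]]
```

The same content thus ends up three times as tall on the narrow screen, which is exactly the "taller and narrower" effect the text complains about.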

The bigger question is why it's even necessary. Back in the day it made more sense. That's because most desktop and notebook screens were first 800 x 600 pixels, then 1024 x 768, and then gradually more, with 1366 x 768 and 1440 x 900 common. As a result, most websites were designed to fit into these standard formats. That was a problem for early mobile devices, whose screens used the 240 x 320 pixel QVGA and at most the 480 x 640 VGA format for many years. To make matters worse, while most of those early mobile devices did have some sort of zoom feature, the devices just weren't powerful enough to make that work.

Now look at the situation today. While, amazingly, desktops and notebooks stuck with the same old resolutions for many years, handhelds made tremendous progress. Laptops, especially, continued on with coarse, grainy displays and didn't change until Apple came up with the "retina" display concept, i.e. pixels so small that the naked eye can no longer see them individually from a normal viewing distance. Desktop monitors, too, resisted change for many years. Even today, "Full HD" 1920 x 1080 is considered state-of-the-art on the desktop, though anyone who has ever worked on even a 24-inch screen with just Full HD can attest that it's no fun. And 4k displays remain very rare on the desktop, with even 2k displays the exception.

Compare that sluggish progress to what's been happening on smartphones. Numerous smartphones now have 1920 x 1080 resolution, the same as most giant HDTVs. A good number offer 2k (2560 x 1440) screens, and there are even a few with 4k Ultra HD (3840 x 2160) resolution. That's incredibly sharp on a small smartphone display!
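The sharpness comparison comes down to simple pixel-density arithmetic. The 24-inch monitor size comes from the text above; the 5.5-inch phone diagonal is an illustrative assumption (roughly the size of a large 2017 smartphone), not a figure from the text.

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# "Full HD" is coarse on a 24-inch desktop monitor...
print(round(ppi(1920, 1080, 24)))   # 92
# ...but very sharp on an assumed 5.5-inch phone screen...
print(round(ppi(1920, 1080, 5.5)))  # 401
# ...and sharper still at 4k Ultra HD on the same assumed phone size.
print(round(ppi(3840, 2160, 5.5)))  # 801
```

Anything much past roughly 300 ppi is commonly considered "retina" at phone viewing distances, which is why the same pixel count that looks grainy on a desktop looks razor-sharp on a handheld.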

What that means is that the average webpage very easily fits on most modern smartphones, and often with room to spare. Granted, while super-sharp, text and graphics and layout look tiny on even a big smartphone screen. That's what that wonderful pinching and zooming on capacitive touch screens is for! Combined with the superior performance of modern smartphones and small tablets, effortless zooming and panning around a webpage is a piece of cake. And, in my opinion, far, far preferable to having to deal with a dumbed-down "mobile-optimized" website that keeps tumbling and pinching its ugly blocks around. So there.

Is it easy to make even the most common and widespread technologies one-size-fits-all? It isn't, as Microsoft has learned the very hard way. But with the reality that most handheld devices have just as many (or more) pixels to work with as laptops and desktops, I see no reason to engage in needless "mobile-optimized" projects.

In fact, one of the nasty consequences of that rush to make those ugly stackable oddities is that we now have lots of corporate sites built for a (wrongly presumed) lowest common denominator. My bank, for example, now has a website that looks like it was designed for an early iPhone. It's one-size-fits-all, it looks forlorn and ugly on a laptop or desktop, and it's super-inefficient to boot.

Progress isn't always progress.

Posted by conradb212 at 06:36 PM | Comments (0)

February 22, 2017

In search of a prepaid, transferrable SIM

At RuggedPCReview.com, we often analyze and report on mobile computing devices with integrated WWAN (mobile broadband) capability and a SIM card slot. SIM (Subscriber Identity Module) cards are smart cards that contain a subscriber's phone number and certain data. Initially used just for the early GSM (Global System for Mobile Communications) networks, SIM cards are now also used by carriers that based their networks on the rival CDMA (Code Division Multiple Access) technology, like Sprint and Verizon. However, those carriers use SIM cards only for 4G LTE.

Anyway, it would be nice for us to be able to test device wireless data capabilities while we review them. We can't always ask vendors and manufacturers to get us test units with activated voice and data service, and we cannot, of course, set up cellular service for every such device that comes into our lab. So I wondered if there is an equivalent of prepaid phones and cards for SIM cards. That way, we could load a SIM with so and so many minutes or gigs of data and then simply insert it into a SIM card slot for testing. And then transfer it into another device for testing when the need arises.

Problem is, there is a difference between prepaid phone cards and SIMs. With a prepaid phone, you buy minutes of talk time for a particular phone, and the prepaid cards simply replenish the allocated minutes for that phone. SIMs, on the other hand, contain your phone number and contacts and such, which makes the SIM inherently portable. The phone number and data on the SIM go with a certain account and a certain network, and so, presumably, as long as you had minutes or a data allocation on a network, you ought to be able to transfer SIM cards between devices that have SIM card slots.

Online search yielded conflicting information as to whether there are prepaid SIMs and if so, whether they could be inserted into another device without having to go through the hassle of setting up service again for the new device.

So I went to WalMart. They had aisles and aisles of phones and prepaid phones and prepaid phone cards, but just a single "SIM kit." It was not apparent from the packaging how it would all work, how minutes or data would be purchased, or what it would cost. I asked the folks at the electronics counter, and they had no idea.

I tried Best Buy next. Best Buy, too, has a very large number of phones from various vendors and for various networks. I explained our situation -- that we needed a prepaid SIM that we could move from device to device for testing of their communications -- and the answer was twofold. First, Best Buy only sells phones and carrier service, not cards and prepaid stuff. Second, such a thing as I needed did not exist.

This seemed unlikely to me. The only difference between a prepaid phone and a prepaid SIM would be that with a SIM you could take your phone number and contacts and other data from one phone to another; service on the carrier side would be the same: so and so many minutes or so and so much data is allocated to a particular phone number.

So I went to Walgreens. Lo and behold, they had three SIM card kits, each costing $10, for which you got a nano SIM card with adapters for use in devices that take the larger micro and standard SIM cards. Of the three, the GoPhone kit looked the best, with its packaging claiming to work on both iOS and Android devices, and also showing "refill and go" cards on its package. Further, GoPhone uses the AT&T network that we already use at RuggedPCReview.

Back at the office I browsed through the voluminous documentation that included a frightening amount of fine print, the kind that leads to the modern problem of never really knowing what something actually ends up costing. Or what extra charges would be added for this, that, and the other thing. Yes, there were 18 pages of AT&T legal fine print. Annoying.

But I did follow the basic directions, stuck the tiny nano SIM card into the standard SIM adapter that our first test device required, and then used a WiFi connection to go to att.com/mygophone and set up an account. I was pleased to see that nothing but an email was required, name not mandatory. Which, given that there is no contract and it's all prepaid, seems appropriate. I quickly got an email back with a phone number associated with my new SIM, and a request to now log on and pay. That, of course, instantly required a name and the usual detailed personal information. So goodbye, (relative) anonymity, and can't you guys just give me the full scoop right upfront?!

And it's really not just buying minutes and then using them until they are gone. An AT&T account is required, you have to sign up for a service, the service automatically renews every month, etc., etc. You CAN purchase refill cards, and that, presumably, avoids using credit cards or automatic bank payments. So I'll look into the refill cards just for the sake of it. However, what I wanted most, a SIM that had so and so many non-expiring minutes on it, I didn't get. Whatever you don't use you lose, and every month there's a new charge even if you didn't use the phone for even a single call. Boo!

Anyway, the device worked just fine. But after being done testing it, would I be able to transfer the SIM to another device with a different operating system? I tried that by going from an Android device to one using Windows 10 Mobile. It worked. No fuss no muss, the new device simply had service and worked just fine. These two devices, however, had virtually identical (or identical) hardware. What about putting the SIM into a different handheld or smartphone?

I tried that with another Android device of a different make and a different version of Android. That one used the micro SIM format instead of the standard format. I found that popping the tiny nano SIM card in and out of its fragile plastic adapter isn't something one wants to do very often. And also that most SIM card slots obviously weren't made for frequent insertion and removal of cards. They are very fragile.

But the device came right up. No problem. What is confusing these days is figuring out what uses data and what still uses minutes. Most phone plans have now been changed to data plans, but the inexpensive phone-only plans still use minutes. And that's what the (relatively) inexpensive GoPhone plan does. So the phone wants to know if data use is permitted? No, I guess not. But it still shows data use as on, which greatly raises the paranoia of stepping into one of the phone companies' many gotcha traps.

The phone, however, works. And so far I've been able to transfer the SIM from one unlocked device to another without any issues.

Posted by conradb212 at 09:03 PM | Comments (0)

December 15, 2016

Mobile Operating Systems Crossroad?

Interesting situation with mobile operating systems in the industrial and enterprise space. For many years, Windows Mobile (later named Windows Embedded Handheld) was the OS of choice, but then Microsoft stranded it. The industry hung in there with the abandoned mini OS for a number of years, but then slowly began looking at Android as a replacement. But now, just as Android is gathering steam in industrial handheld markets, Microsoft finally introduced a Windows Mobile replacement in Windows 10 IoT Mobile Enterprise (what a name!). So major rugged and enterprise mobile hardware vendors are beginning to offer devices that support both of those operating systems (the latest is the Zebra TC70x).

Obviously this situation requires some detailed analysis. The rugged handheld devices market isn't huge, but according to VDC we're still talking several billion dollars, not exactly spare change. In many respects, Android has been its own worst enemy with version fragmentation and emphasis on consumer markets, but as of late the platform has made significant strides in becoming more acceptable to enterprise users. On the Redmond side, Microsoft's neglect of its erstwhile market-leading mobile platform and lack of serious follow-up on the final feasible version (6.5) may well have driven many customers to take a wait-and-see approach to any mobile version of Windows 10.

What does make Windows 10 IoT Mobile Enterprise interesting is the different approach Microsoft is taking with Windows 10. Instead of having different versions of Windows for different markets, this time Microsoft is using a "unified core" that's common to all versions of Windows. That doesn’t mean there’s just one Windows that runs on every type of device. That wouldn’t make sense given the great variety of devices and their variations of purpose, size and resources. But under Windows 10 there is a common core with each family of devices then adding suitable features to that core. In principle, that means that developers don't have to write different code for different Windows devices. If that turns out to be feasible, it is a major selling point for Windows 10 IoT Mobile Enterprise.

And it means that those in the mobile computing hardware business need to watch that situation very, very closely.

Posted by conradb212 at 04:08 PM | Comments (0)

November 30, 2016

Sharp, clear web images

RuggedPCReview readers who view our site on a modern tablet, smartphone or high resolution monitor have probably noticed that many images on RuggedPCReview.com are noticeably crisper and sharper than those on the vast majority of websites. Why is that? It's because earlier in 2016 we started serving images in both standard and high resolution. That way, if your browser automatically detects that you're viewing a page on a device with a high resolution display, it'll serve a high resolution image. If you're viewing it on a standard display, it'll load a standard resolution picture. Let me explain.

While high and higher resolution is a huge deal in consumer tech and advertising (think 4K TV, megapixels in cameras, and smartphone resolution), all that ginormous resolution is totally useless when creating images for websites. For example, a 4:3 aspect ratio picture from a 12-megapixel smartphone camera has 4000 x 3000 pixels. That's more than fits on a giant 4K TV screen with its 3840 x 2160 pixels, let alone a "high definition" laptop screen with its measly 1920 x 1080 pixels. Now add to that the fact that most pictures on web pages are quite small. And that web pages need to load as quickly as possible. And that the image resolution standard for web page pictures really hasn't changed in almost a quarter of a century (it's still 72dpi or 96dpi).
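The arithmetic is easy to check with the numbers above:

```javascript
// Pixel counts for the examples above: a 12-megapixel 4:3 photo has more
// pixels than even a 4K TV can show, let alone a Full HD laptop screen.
const photo = 4000 * 3000;   // 12,000,000 pixels
const tv4k = 3840 * 2160;    //  8,294,400 pixels
const fullHd = 1920 * 1080;  //  2,073,600 pixels

console.log(photo > tv4k);                 // true
console.log((photo / fullHd).toFixed(1));  // "5.8" -- nearly six laptop screens' worth
```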

What that means is that while cameras and TVs and phones and even some monitors have wonderfully high resolution today, the pictures we're viewing on the vast majority of websites are presented in dismally low resolution. They are fuzzy and grainy and really a bit embarrassing (though one might argue they fit right in with those terrible Wordpress layouts so many of today's websites use).

So what can be done? Well, we code our web content so that folks who view our site on a high-resolution device will get high-resolution pictures. Easy as that. Or rather, not quite as easy as that. Because while at RuggedPCReview.com we found a way to serve high-resolution pictures without slowing down page loading, figuring out how to do that wasn't easy. And it's extra work to generate, and code for, multiple versions of every image. But we figured our viewers are worth the extra effort, as are our sponsors, and the terrific technology invested in all that rugged technology we report on, analyze, and test.
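For the curious, here's a minimal sketch of the general idea in JavaScript. This is not RuggedPCReview's actual code, and the "@2x" file-naming convention is an assumption used purely for illustration:

```javascript
// Sketch: pick a double-resolution image variant on high-density displays.
// The "@2x" naming convention is hypothetical, for illustration only.
function pickImage(fileName, devicePixelRatio) {
  // Retina-class displays report a devicePixelRatio of 2 or higher;
  // for those, swap in the high-resolution variant of the image.
  const suffix = devicePixelRatio > 1.5 ? "@2x" : "";
  return fileName.replace(/(\.[a-z]+)$/i, suffix + "$1");
}

console.log(pickImage("review-photo.jpg", 2)); // "review-photo@2x.jpg"
console.log(pickImage("review-photo.jpg", 1)); // "review-photo.jpg"
```

In a browser, `window.devicePixelRatio` would supply the second argument. Plain HTML can also do this declaratively: an img tag with `src="review-photo.jpg" srcset="review-photo@2x.jpg 2x"` lets the browser make the same choice with no script at all.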

So if you've noticed that RuggedPCReview.com is different not only due to the total absence of annoying pop-ups, lame click bait, and ads you must read before you get to the site, but also in the very sharp and clear images we present, you're viewing the site on a contemporary high-res display, and we made sure it's the best possible viewing experience for you.

But aren't there still some older, fuzzy images on the site? Yes, old content remains as it was. But any new content is now created with multi-resolution pictures, and we're hard at work updating old imagery as well.

Posted by conradb212 at 04:27 PM | Comments (0)

October 31, 2016

Rocky (not Balboa) has left the building

Back in 2003 we approached the then-titans of the rugged notebook industry with this challenge: "Send us whatever you consider your best all-purpose rugged notebook computer for a roundup!" Who did we send that challenge to? You'd think Panasonic, Getac and Dell or GammaTech. Panasonic, yes, but back then the other two we chose for the shootout were Itronix and Amrel.

Panasonic, of course, had been making rugged notebooks since 1996, and Itronix, too. Itronix, which at the time was a subsidiary of Telxon, had sent us one of their X-C 6000 Cross Country computers for review in the mid-90s. We wrote that "its bulldoglike ruggedness means you never had to worry about it." Back then, incidentally, we also reviewed the all-magnesium M3I PCMobile rugged laptop whose keyboard was removable, so the current trend towards 2-in-1s isn't something new (think 1994 Compaq Concerto).

Anyway, we reviewed the rugged Amrel Rocky II in 1998. We liked it and commented that "the unit's extraordinary ruggedness and clever, flexible sealing mean that it can be used just about anywhere." And actually introduced it in the article as "representative of the re-emerging class of rugged pen-enabled clamshell notebooks." That's right, re-emerging, back in 1998. Because there had been tough pen-enabled notebooks back in the early 1990s.

So that explains why it was Amrel and Itronix that were duking it out with Panasonic for the best rugged notebook in our 2003 shootout. Who won? It was pretty much a draw. We reported that "at decision time, what it may come down to are very specific requirements that only one of the three may be able to fulfill." We loved the Amrel's individually sealable connectors and its ultra-sealed keyboard. And the fact that it was the only competitor with zero flex. We loved the raw processing power of the Itronix GoBook. And we praised Panasonic for its "overall fit and finish that no one can match."

Getac was also already on the rugged scene at the time of our 2003 comparo. We'd examined their mid-range A320 in 2000, and the big A760 with its extra connectivity was also available. I should mention that the debate about what differentiates rugged and semi-rugged was as heated back then as it is today. With the possible exception that what used to be considered "rugged" a decade and a half ago would now be sold as "ultra-rugged" or "fully rugged," and what used to be "semi-rugged" is now routinely called "rugged."

Be that as it may, back in 2003 it was Panasonic, Itronix and Amrel we asked to step up to the plate. Panasonic, of course, went on to become the market leader in rugged notebooks, a position the company still holds on to today. Itronix was bought out by defense giant General Dynamics in 2006 or so, hung in there for a couple more years as General Dynamics-Itronix before GD shut them down.

Amrel, however, bravely soldiered on, even in the light of increasing competition from Getac and then Dell. While rarely at the technological forefront in terms of processors and ancillary technology, the company dutifully delivered very good rugged notebooks in the 13.3-inch, 15.1-inch and even 17-inch class. And that on top of also offering a full roster of 8-inch to 12-inch rugged tablets, and an equally impressive lineup of rugged handhelds that included Android-based models and special versions for biometric identification and such. Only a year ago, in late 2015, Amrel launched the ingeniously simple "Flexpedient" AT80 Android tablet that combined Ford Model T simplicity with modern technology and seemingly unlimited application potential.

We often wondered how Amrel managed to hang in there with modest advertising and only modest pursuit of review and PR opportunities. We figured it probably was because they very narrowly focussed their efforts on their target markets and had no interest in becoming known to a wider circle of potential customers. But they did hang in there, and we had occasional contact with Amrel and discussions of areas of interest to the rugged markets.

Where did Amrel come from in the first place? From the bits and pieces of available information, it seems that the company was formed by Edward Chen, who had once been a VP at Motorola and had also co-founded Crete Systems in Taiwan around 1990. Amrel and Crete Systems had a close relationship, with Crete apparently the ODM/OEM for most of Amrel's rugged computers. There was also AMREL's subsidiary, Bionetek Corporation, which developed an early cardiac diagnostic system, and Solarmer Energy, which was into polymer solar cells. There was further MilDef, which had partnered with Amrel since the mid-1990s, and there was Germany's Roda, launched in 1987. There was the MilDef Group that consisted of MilDef Systems in Sweden, MilDef Ltd in England, MilDef AS in Norway, and MilDef Crete in Taiwan. Everything was related in one way or another.

Why mention all this? Because after all these many years, in September 2016 Amrel in Southern California suddenly quit the rugged computer business and sold its computer division to MilDef Group. The news release stated that "MilDef Inc. will carry on AMREL’s legacy under its MilDef brand name and continue providing our customers exceptional service and support after the acquisition." So most of the Amrel Rocky handhelds, notebooks and tablets are now sold as MilDef products.

Since Amrel had such a long history in rugged computing we, of course, wondered what happened. That doesn't seem to be clear even to insiders. Our email inquiries to both Amrel and MilDef went unanswered or shed little light on what happened. We consider that regrettable. There's too much history in that quarter of a century of rugged Amrel computers for it to simply go away. And though MilDef has taken over the line, at a time when the rugged notebook market leaders are charging full-speed ahead, it'd be good to know who MilDef is and what the company's intentions are, especially in the US market. After all, we're talking about a product line that once battled the leaders in the field to a draw. They should not just vanish into obscurity.

Posted by conradb212 at 10:54 PM | Comments (0)

September 05, 2016

Intel introduces Kaby Lake, the 7th generation of Core processors

In August, Intel officially introduced the first few of its 7th generation Core processors, codenamed "Kaby Lake." That comes at a time when the news about PCs generally isn't very good, when Microsoft has a very hard time convincing users to switch to Windows 10, and when it's becoming increasingly difficult for vertical market hardware manufacturers to keep up with Intel's rapid-fire release of new generations of high-end processors.

7th generation Kaby Lake also comes at a time when 4th generation "Haswell" processors are considered quite up-to-date in the mobile and industrial PC arena, 5th generation "Broadwell" makes customers wonder how it's better than Haswell, and 6th generation "Skylake" leaves them befuddled because, well, what happened to Broadwell? And the rather expensive (US$281 to US$393) new 7th generation chips also come at a time when customers balk at paying more than a hundred bucks for a tablet or more than two or three hundred for a basic laptop.

So what is Intel thinking here? That they simply must follow "Moore's Law," which says that the number of transistors that fit on a given piece of chip real estate doubles every 18 months? Or that, like Disney, catering to a small clientele to whom price is not an issue is the profitable way to go? It's hard to say, especially since the generation game really hasn't been about meaningful increases in performance for a good while now.

That's certainly not to say that the new chips aren't better. They are. Intel loves to point out how many times faster new generations are per watt than older ones. And that's really getting closer to why all of this is happening. It's mostly about mobile. See, back in the day everyone knew that you just got an hour and a half max from a notebook before the battery ran out, and that was grudgingly accepted. But then came Steve Jobs with the iPad that ran 10 hours on a charge. And somehow that's what people came to expect.

On desktops, performance per watt hardly matters. You plug the PC in and it runs. Compared to heating and air conditioning, toasters, ovens, TVs and a houseful of light bulbs, whether a chip in a desktop runs at 17 watts or 35 watts or 85 watts hardly matters. But in mobile devices it does. Because Steve Jobs also decreed that they needed to be as slim as possible, so big, heavy batteries were out. It all became a matter of getting decent performance and long battery life. And that's one of the prime motivations behind all those new generations of Core processors.

Now combine that battery saver imperative with a quest to abide by Moore's "law" (which really just was a prediction) and — bingo — generation after generation of processors that each is a bit more efficient and a bit quicker.

How was it done? By coming up with a combination of all sorts of clever new power-saving techniques and by continuously shrinking the size of the transistors that are the basic building blocks of a microprocessor. To provide an idea of just how small things are getting inside a microprocessor, consider this:

A human hair is on average about 100 micrometers thick, a tenth of a millimeter or about 4/1000th of an inch. The 8080 processor that started the PC revolution in the late 1970s with early microcomputers like the MITS Altair was based on 6 micrometer lithography, or "process technology." Process technology is generally defined as "half the distance between identical features in an array." So the smallest distance between two transistors in an 8080 was 12 micrometers, or about an eighth of the thickness of a human hair.

Over the decades since then, process technology has been miniaturized again and again and again. Whereas with that old 8080 chip (which cost three or four bucks at the time) it was 6 micrometers, which is 6,000 nanometers, the 7th generation of Intel Core processors uses 14 nanometer process technology. Down from 6,000 to 14. So whereas the old 8080 had about 6,000 transistors total, with 14 nanometer process technology Intel can now fit over a billion transistors onto the same amount of chip real estate. And with the die size of your average 7th generation Core processor being larger than that of the little old 8080, it's probably more like five billion transistors or more. The head spins just thinking about it.
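Running the numbers from the paragraphs above confirms the "over a billion" claim:

```javascript
// Feature size shrank from 6,000nm (8080 era) to 14nm (Kaby Lake).
const linearShrink = 6000 / 14;          // ~429x smaller in each dimension
const densityGain = linearShrink ** 2;   // ~184,000x more transistors per unit area
const transistors = 6000 * densityGain;  // the 8080's ~6,000 transistors, scaled

console.log((transistors / 1e9).toFixed(1)); // "1.1" -- just over a billion
```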

The upshot of it all is that the hugely larger number of logic gates on a chip offers vastly greater computing performance, which you'd think would require vastly more power. But thanks to the hugely smaller size of all those transistors, that's actually not the case. Between the tiny size and all those logic gates available to run ultra-sophisticated power-savings operations, the chips are both more powerful and use less energy.

Now that said, there appears to be a law of diminishing returns. It's a bit like video games where early on each new generation had much better graphics, but now things are leveling off. The visual difference between mediocre and acceptable is huge, the difference between very good and terrific much smaller, and the difference between super-terrific and insane smaller yet. Same with processor technologies.

As a result, the performance and efficiency increases we've seen in the benchmark testing we do here in the RuggedPCReview lab have been getting smaller and smaller. By and large, 5th generation Broadwell offered little more than 4th generation Haswell. And 6th generation Skylake didn't offer all that much over Broadwell. The last really meaningful step we've seen was when 4th generation Haswell essentially allowed switching mobile systems from the standard voltage to ultra-low voltage versions of chips for much better battery life (or a much smaller battery) at roughly the same performance. Yes, each new generation has tweaks and new/improved features here and there but, honestly, unless you really, really need those features, bigger gains are to be had via faster storage or a leaner OS.

So there. As is, as of late Summer 2016, there are now six 7th generation Kaby Lake Core processors, all mobile chips. Three are part of the hyper-efficient "Y" line with thermal design power of just 4.5 watts, and three of the only super-efficient "U" line with TDPs of 15 watts. The primary difference between the two lines is that the "Y" chips run at a very low default clock speed, but can perform at a much higher "turbo" clock speed as long as things don't get too hot, whereas the "U" chips have a higher default clock speed with less additional "turbo" headroom. Think of it like the difference between a car with a small, very efficient motor that can also reach very high performance with a big turbo, versus a vehicle with a larger, more powerful motor with just a bit of extra turbo kick.

In general, Intel has been using what they call a "tick-tock" system where generations alternate between "tick" (yet smaller process technology, but same microprocessor architecture) and "tock" (new microprocessor architecture). By that model, the 7th generation should have switched from 14nm to 10nm process technology, but it didn't and stayed at 14nm. Apparently it gets more and more difficult to shrink things beyond a certain level, and so Intel instead optimized the physical construction of those hyper-tiny transistors. That, they say, allows things to run a bit cooler and requires a bit less power, resulting in, according to Intel, a 12-19% performance gain, mostly through running the chips at a higher clock speed.

The architectures of both the cores and the graphics haven't really changed. But there are some additions that may be welcomed by certain users. For example, Kaby Lake has much better 4K video capability now, mostly in the hardware encoding/decoding areas. And a new implementation of Speed Shift lets the CPU control turbo frequency instead of the operating system, which means the chip could speed up much faster. We'll know more once we get to compare Kaby Lake performance and efficiency with that of the predecessor processor generations.

There's some disturbing news as well. Apparently, some discussions and agreements between Intel and Microsoft resulted in Kaby Lake not really supporting anything before Windows 10. We don't know if that means older versions of Windows simply would not run, or just that they wouldn't run well. Given that so far (early Sept. 2016), Windows 10 only has 23% of the desktop OS share, any restriction on using older versions of Windows on new chips seems both ham-fisted and heavy-handed.

For a detailed tech discussion of all things Kaby Lake, check AnandTech.com.

Posted by conradb212 at 08:16 PM | Comments (0)

August 29, 2016

Congrats to Xplore Technologies: 20 years of rugged tablets, and only rugged tablets

At the January 1997 Consumer Electronics Show in Las Vegas, I walked into the South Hall of the Las Vegas Convention Center on the lookout for something — anything — new and exciting in tablets or pen computers. Sure, Microsoft had announced Windows CE at the Fall Comdex in response to Apple’s Newton Message Pad and the emerging “Palm Economy,” and our bi-monthly Pen Computing Magazine was doing well. But, by and large, handhelds and tablets were very far removed from the booming world of desktop computers and laptops and printers and the latest of absolutely-must-have PC software.

But there, amidst all of the glitzy, glossy booths of mainstream computing was… an even glitzier and glossier booth by a company I had never heard of. They called themselves Xplore Technologies, and they were thinking big. There had, of course, been rugged computers before, but most were quite utilitarian and often looked a bit unfinished. Xplore’s Genesys, on the other hand, looked like something right at home on the Starship Enterprise. Cool industrial design, bold lines, even bolder plans. Having seen my share of grand plans I admired the effort but wasn’t convinced that these folks' vision was actually going to see the light of day, let alone become a success. However, between a persuasive VP of Marketing, the grand booth, and the look of the various models (there weren’t any fully functional production units yet), I agreed to an interview with Xplore's boss and committed to coverage in our print magazine.

And so this is what we ran in Pen Computing Magazine Volume 4, Number 15, page 41, in early 1997:

Xplore Genesys

Pen technology “dream team” presents impressive new system

Every so often, individuals — or groups of individuals — get dissatisfied with the status quo and set out to create new solutions, new forms of government, new companies, or whatever it takes to make things right.

One such group of individuals found the general status of mobile and pen computing lacking and joined together to form Xplore Technologies Inc. Xplore has an impressive roster of seasoned professionals from all areas of mobile computing, both on the vendor and customer sides. Founding members have earned their professional experience and reputations at companies such as GRiD, Telxon, Motorola, Intel, Fujitsu, Telular, and a number of vertical market industrial clients.


Fueled by a common vision of offering technologically advanced “whole product solutions," a firm belief in an annual pen computing market growth rate of over 30%, and financing by a small group of supportive investors, the Xplore team is conjuring up a compelling business strategy based on tactical partnerships with companies that provide products and services complementary to Xplore’s technology offerings.

Believing that existing pen and mobile systems often fail because they are created by technology companies without real knowledge of their markets, Xplore not only recruited industry representatives into their core staff, but also developed the specs of their “Genesys” product family in conjunction with customers from their targeted markets — utilities and public safety. The result is a very functional, very attractive design that’s both technologically up-to-date and ready for future expansion, a necessity in markets where equipment is expected to have a life cycle of several years.

The Xplore Genesys pen computer, much like the TelePad 3, is based on a main "brain," or core, that houses the main logic board, power, memory, and screen, and X-pods that contain peripheral functionality, such as GPS systems, additional batteries, wireless communications, and various I/O options. The X-pod expansion bay is shaped so that it doubles as an ergonomically shaped hand grip for the unit. Xplore calls both the core unit and the X-pods "environmentally indifferent," i.e. water resistant, with shock-mounted components in a sealed composite (or optional aircraft aluminum) inner housing for the core, and equally impressive sealing of the pods. The unit is further protected with impact resistant exterior moldings, all combining to give a Genesys computer a good chance to survive a 4-foot drop onto concrete.

As should be expected from a brand-new, "clean slate" design, the Genesys includes thoroughly modern components, starting with a very-low-voltage Intel Pentium processor running at 133MHz; a choice of two color LCD screens (including a TFT High-Brite version) and one monochrome option, all offering a large 10.4" diagonal viewing area and 800 x 600 SVGA resolution; 64-bit PCI bus architecture; an electrostatic pen interface with touch screen functionality; no less than three docking systems; and — of course — Windows 95. There is room for up to 64MB of RAM and up to 3.2GB of hard disk space.

Since Xplore projects a significant number of Genesys slates to be vehicle mounted, special care was given to an optimally designed vehicle dock. The airbag zone compliant dock has a separate breakout box for cable management and uses standard PC connectors. The desktop dock provides access to CD-ROMs, LANs, modems, keyboards, and external monitors, essentially turning the Genesys into a fully functional desktop computer. The office dock, finally, is a space-saving design with a LAN controller that allows mounting up to four tablets on the wall for easy access. All docks are based on the same mechanical docking head, and all offer fast charging capabilities and expanded intelligence through an I/O controller in the docking head.

As of this writing (January 1997), Xplore was in the process of assembling final Beta units for testing with a limited number of customers in February 1997. According to Xplore, production is scheduled to begin in late March.

Our impression? What we have here is a high powered group of very qualified people developing and marketing what they believe is the very best product for the pen computing and mobile market. This is good news for pen technology in general, and for companies seeking a state-of-the-art mobile solution in particular. — Conrad H. Blickenstorfer, Pen Computing Magazine

And here's what it looked like in that early 1997 issue of Pen Computing Magazine:

It’s hard to believe that it’s been almost 20 years since I wrote that article. And pretty much exactly 20 years since Xplore began making rugged tablets. Back then, Xplore’s competition included Teklogix, Dauphin, DES, Epson, Granite, Husky, IBM, Itronix, Kalidor, Microslate, Mitsubishi, Norand, PGI Data, Telepad, Telxon, Texas Micro, WalkAbout and others. All gone, absorbed, or no longer in the rugged tablet business. Xplore, however, is not only still here, but expects fiscal 2017 revenue of between US$85 million and US$95 million. And Xplore is #2 in global rugged tablet marketshare. Quite impressive.

It hasn’t been an easy ride for Xplore. There was customers’ general reluctance to embrace the tablet form factor. There were the special demands of tablets that always seemed a year or two ahead of available technology. Despite Microsoft Windows for Pen Computing and then the Tablet PC Edition of Windows XP, Windows never was a natural for tablets. So business was hard, even after the iPad opened the floodgates for tablets.

Yet here Xplore is, now with the complementary product line of fellow tablet pioneer Motion, stronger than ever. It’s ironic that while once it was lack of acceptance of tablets that was Xplore’s biggest problem, now it’s the very success of tablets that’s a challenge — with tablets so cheap, many potential customers just buy consumer tablets and stick them in a case.

So after 20 years of making tablets and nothing but tablets, questions remain. On the far end, how rugged is rugged enough? What degree of ruggedness is compelling enough to sway possible markets, and at what price point? How can one profitably grow while remaining under the radar of consumer electronics giants (so they won’t start an “active” or “outdoor” or “adventure” version of one of their products)? None of these questions are easy to answer. Or the answers easy to implement.

But having been around for 20 years and having the benefit of all that experience, few are in a better position to succeed than Xplore Technologies. Here's to the next 20, Xplore!

Posted by conradb212 at 06:05 PM | Comments (0)

August 09, 2016

Why we take things apart and show what's inside

At RuggedPCReview, we take things apart. We open up handhelds, tablets, panels, notebooks and industrial PCs. We dissect them methodically, documenting our progress, jotting down observations and commentary. What we find inside a product becomes part of our detailed reviews, including pictures of the insides and of interesting details.

We do this because ruggedness isn't something that's just skin-deep. Truly rugged mobile computing devices are designed from the ground up to be tough and rugged, able to handle the various kinds of abuse they may encounter in customers' hands (and falling out of customers' hands). While the outsides of a successful consumer product must look good and appeal to the eye, a rugged product must look good inside, too, and by "look good" we mean designed and built to handle abuse. For us here at RuggedPCReview, that means it's mandatory to look inside and describe what we find. Else we wouldn't do our job.

We've felt this way for a very long time. Ever since, back in the mid 1990s, we reviewed a tough-looking tablet its manufacturer said was specifically designed for the military and operation under the harshest conditions. It looked very tough indeed, but when our editors took it apart, it was like a half-finished science project inside. There were wires and loose connectors everywhere, things were not fastened in place, seals were inadequate or non-existent, and the internal layout and organization did not make sense. There was no way that product was going to hold up out there in the field. Not surprisingly, that company went out of business shortly thereafter.

It was then that we decided to review what's inside a rugged device as carefully as we describe and document what's outside. We love taking pictures that show off a product out there in the muck, rain, water, snow or ice, because those are the extreme conditions rugged computing products are being designed for. But we also show what's inside. Because what's inside, the computer, is what the tough and rugged exterior must protect, and even the hardest shell cannot protect the guts of a rugged system if it's not designed and built right inside.

By and large, the guts of today's rugged products are far, far better than we've seen in the past. We used to see plenty of seals that could not possibly seal, plenty of connectors that could not possibly stay connected, plenty of parts that were certain to break, plenty of layouts that were too complex to work, and plenty of cooling systems that could not stay cool. We saw plenty of foils, conductive material, seals, screws and soldering that could not possibly survive even the first time the unit was taken apart for repair or maintenance. We saw plastic clips that would break, screw sockets that would fail, seals done wrong (or omitted entirely), and materials that simply made no sense.

It is better now, and perhaps our many years of documenting and discussing what's inside rugged systems, and how they are made, have contributed in a small way to that progress. And even if not, it has probably helped raise awareness among interested parties of what's inside all those important and often costly tools for tough jobs, tools that must not fail.

The vast majority of manufacturers we have worked with over the years understands that. Most take pride in the internal quality of their products and appreciate our documentation of the insides of their products with photography that's often much better than what even the FCC does.

But every once in a while, we're told we must not open a device or must not publish pictures of what's inside. Stated justification for the former may be that a unit is sealed and opening it would destroy the seal and reduce or eliminate ingress protection. We don't consider that a good argument for two reasons. First, we can't recommend a product when we're not even allowed to look inside. And second, if seals break when the unit is taken apart, that makes service expensive, difficult and inconvenient, big negatives all.

We've also had a very few requests not to publish interior pictures because then the competition would know how it's done and steal the design. That, likewise, we do not consider a good argument. If the competition is indeed concerned enough to want to know what's inside a product, they will simply buy one and see for themselves (that happens all the time, everyone does it). But what if designs are "stolen"? Still not a good argument; one cannot easily copy an entire design from a picture. We're not talking rearranging Lego blocks here.

By and large our experiences with the industry have been overwhelmingly good. Almost everyone is helpful and genuinely concerned about making the best possible products. Project managers, in particular, take great pride in the designs they are entrusted with. Most love to share, discuss issues, answer questions, and appreciate feedback. Most marketing people we work with are also great sources of information as well as helpful conduits to/from technical staff and PMs.

Reader and site visitor feedback is uniformly in favor of detailed reviews that show both the outside and the insides of the products they are interested in. It helps them make more educated purchasing decisions.

So that is why we here at RuggedPCReview take things apart and show what it looks like inside. We could save ourselves a lot of time and effort not doing it, but then we wouldn't be doing our job. And we wouldn't do a favor to manufacturers who often learn from our third-party analysis, and we certainly wouldn't do a favor to our readers.

Posted by conradb212 at 05:18 PM | Comments (0)

June 29, 2016

The Microsoft Surface mystery

According to numerous reports online, Microsoft will apparently stop offering the Surface 3 tablet by the end of 2016 and it's not certain if there'll ever be a Surface 4. Microsoft, of course, has had a checkered past with its forays into hardware, and many of the company's hardware partners likely have mixed feelings about the Surface tablets that are direct competition for their own products.

Yet, the Surface tablets appeared to have been quite successful. After a rocky start with the wing-clipped Windows RT tablets, sales of Surface tablets running real Windows looked very good. Back in February 2016 we reported on IDC estimates of 1.6 million Surface tablets sold in Q4 of 2015. Most of them were Surface Pro models and not the lower end Surface 3, but anytime a tablet product line sells in the millions, we'd see that as a success. For the full fiscal 2015, Surface sales amounted to US$3.6 billion, a big increase over 2014's US$2.2 billion, which already had greatly exceeded 2013's sales of under US$1 billion.

So what do we make of that? Why would Microsoft give up on the Surface 3 and perhaps not even offer a Surface 4? Most likely because selling low-end tablets is just not profitable. The predicament's clear: with "white box" tablets practically being given away, and even brand name tablets selling for very little, price is an issue in the tablet market where differentiation in anything but performance is difficult.

And performance costs money. Even mid-range Intel Core processors cost hundreds of dollars at a time when even good Android tablets can be had for a fraction of that amount. So in order to compete, Windows tablets had to resort to low-end Intel processors, mostly from the Atom and perhaps Celeron lines. Some Atom chips cost no more than 20 bucks or so in quantity, and so there's plenty of temptation to use them in lower-end Windows tablets.

Which is almost always a big mistake. The mobile computing market is littered with failed products whose designers had tried to keep costs down by using cheap Atom chips. The low, low price of netbooks had seduced tens of millions, who then quickly found out that acceptable performance just wasn't there. Same with numerous rugged tablet designs, almost all of which ended up switching to higher-end processors.

So perhaps the retreat from low-end Surface tablets is just an admission that for general purpose Windows computing, low-end processors just can't hack it. So between unhappy customers on the one side, and unhappy beancounters who don't see much profit from these low-cost Windows tablets on the other, it's a losing proposition. This does not mean that Intel's non-Core chips are generically bad choices for personal computing devices, just that there's a very delicate tipping point where the plus of a low price is outweighed by the minus of lower performance.

So perhaps that means Microsoft will be concentrating on the higher end of the tablet market, where the Surface Pro models have done well and the profit margins are higher. Add to that the Microsoft Surface Book and the rising interest in 2-in-1 hybrid tablet devices, and Microsoft's retreat from low end tablets may just be a clever shift in market focus.

2-in-1s, of course, have their own issues. While a device that can both be used as a tablet and as a conventional laptop is a compelling idea, it's surprisingly difficult to implement. The concept has been around for a quarter of a century, but with few actual products that reached more than specialized niche markets. So anyone entering that market would be well advised to examine earlier 2-in-1s, and what kept them from breaking through.

Posted by conradb212 at 02:43 PM | Comments (0)

May 24, 2016

Household items: coding, standards, and "2x" pics

Back in the day when we published Pen Computing Magazine and Digital Camera Magazine and some other titles in print, we always prided ourselves on being in total control of our own destiny. We did virtually everything inhouse — writing, editing, photography, layout, prepress, web, marketing and advertising — and most of us had mastered several of those disciplines. We didn't want to farm anything out or rely on any one expert.

We felt the same about software. We had our own webhosting, our own servers right in our building, and we programmed everything ourselves. That way, no one could force us to update or upgrade when we didn't want to, no one could quietly put more and more unrelated ads, pop-ups and clickbait onto our pages, and no one could suddenly go out of business on us. No one could control us or buy us out either, because we financed everything ourselves. One doesn't get rich that way, but it pretty much guarantees continuity and longevity.

That doesn't mean we didn't run into issues. The author of a terrific piece of software that we used and loved was of the paranoid sort and, even though we paid a hefty price for the system, insisted on compiling everything and locking the software to our particular hardware. So every time we upgraded our servers even in a minor way, we had to go and beg the man for a new code. That became increasingly more difficult, and eventually he refused altogether.

Fortunately, that was an isolated incident, which is a good thing as we use dozens of tools and utilities that run on our own server and without which we couldn't do business. Many are orphaned or haven't been updated in many years. But they still work, and they do the job better than what replaced them.

RuggedPCReview.com is a vast site with thousands of pages. Yet we don't use a content management system or anything like it. We handcode everything. Sure, we have utilities and scripts and routines that make the job easier, but when a new page goes up, it hasn't been generated by rev. 17.22 of version 5.21 of some corporate software system. It's all coded by hand.

Don't get the idea, though, that we're hidebound and unwilling to go with the flow. We routinely evaluate whatever new tools and systems come along. A few years ago we analyzed HTML 5 and recreated part of RuggedPCReview in pure HTML 5. It was an interesting and stimulating exercise, and we adopted part of HTML 5, but didn't see a need to convert everything.

More recently we took a look at WordPress. Like Movable Type that we still use (and run on our own server), WordPress started as just blog software. It's now morphed into a full-fledged content management and site generation system, one that's replacing more and more conventional websites. As we had done with HTML 5, we analyzed WordPress and recreated RuggedPCReview in WordPress.

We rejected WordPress for a variety of reasons. First, we use tables everywhere, and WordPress is terrible with tables. Second, WordPress is based on modules that are pretty much black boxes. You don't know what they do and how (unless you want to dedicate your life to learning and deciphering WordPress in detail). We don't like that. Third, WordPress layout is terrible. Even the best templates look like pictures and text blocks have randomly been dropped on a vast ocean of white background. And fourth, and most egregiously, with WordPress sites you never know if an article or posting is current or three years old.

So thanks, but no thanks. Which means that when we need to implement a new feature on our site, we have to be creative. A couple of years ago one of our much appreciated sponsors was unhappy that sponsor logos were listed alphabetically, which meant that some sponsors were always on top and others at the bottom. A reasonable complaint. WordPress likely has some black box for that, or maybe not. Our solution was to find a script and modify it for our purposes. It's been working beautifully.

Technology advances at a rapid pace, of course, sometimes for the better and sometimes you wonder what they were thinking because what came before worked better. That's mostly the case with software; hardware advances are generally a good thing. But here are a couple of examples of how advances in hardware affect running a site like RuggedPCReview.

There was a time when the web was on desktop and laptop monitors, and phones either didn't have anything like the web, or some separate abbreviated version of it, like the unfortunate and ill-fated WAP mobile web that was on older feature phones. But with smartphones getting ever larger displays and ever more powerful electronics, there really wasn't a need to have two separate webs. Standard web browsing works just fine on phones.

Problem is that even a 5.5-inch screen like the one on the iPhone 6 Plus is awfully small to take in a webpage. You can, of course, quickly zoom in and out thanks to the wonders of the effortless capacitive multi-touch, but that, apparently, was a thorn in the side of interface developers. So we're seeing all those efforts to make sites "mobile-friendly." The currently prevailing school of thought is to have sites consisting of blocks that arrange themselves automatically depending on the size and width of a display. So if you have three pictures next to one another on a standard desktop browser, on a smaller screen the three pictures will rearrange themselves and become stacked on top of one another. Same with text blocks and other site elements.

That may seem like a brilliant solution to programmers, but it's a hideous aesthetic nightmare in the eyes of anyone who's ever done layout and crafted pages just so. The mere idea that this could be a solution seems preposterous. So we're staying away from that nonsense.

But there are other issues. One of them is resolution. There was a time when most desktop and notebook displays used the same resolution, with every few years bringing a new standard that would then slowly be adopted. 640 x 480 VGA was gradually replaced by 800 x 600 SVGA which, in turn, was slowly replaced by 1024 x 768 XGA. Handhelds were in their own category with proprietary screen resolutions (like Palm) or the 240 x 320 QVGA of Pocket PCs.

That first changed when "wide-format" displays became popular. Where once everything had been displayed in the same 4:3 aspect ratio as TV sets, various aspect ratios quickly became an additional variable. The tens of millions who bought early netbooks will remember how the 1024 x 600 format favored on netbooks awkwardly cut off the bottom of numerous apps that were all formatted for 1024 x 768. And so on.

Then something else weird happened. While desktop and notebook displays only very slowly adopted higher screen resolutions, resolution virtually exploded on smartphones and tablets. Apple introduced the concept of "retina" displays where, when looked at from the typical viewing distance of a class of device, individual pixels could no longer be detected by the naked eye. As a result, high resolution quickly became a strategic battleground with smartphones. That led to the interesting situation where many smartphones with small 5-inch screens had the same 1920 x 1080 resolution as 15-inch notebooks, 27-inch desktop monitors, and 65-inch HD TVs. And now there are smartphones with 4k displays. That's 3840 x 2160 pixels, the same as 80-inch ultra-HD TVs.

What that means is that ordinary websites must now display in at least reasonable shape and quality on a device spectrum that ranges from tiny displays with insanely high resolution all the way to much larger desktop and laptop displays with generally much lower resolution, and often still using the old legacy resolutions.

Which is especially bad for pictures. Why? Well, let's take a look at RuggedPCReview. Ever since we started the site in 2005, all pages have been 900 pixels wide. That's because we wanted to comfortably fit into the 1024 pixel width of an XGA display. What happens when you look at the 900-pixel site on a full HD screen with its 1920 pixel width? Well, boxes and text and such scale nicely, but pictures suddenly look much worse. Now go on a 3840 pixel wide 4k screen, and the pictures are hardly viewable anymore.

So what does one do?

Turns out this is an issue that's been hotly debated for several years now, but a common solution hasn't been found. I did some research into it, and there are literally dozens of ways to make pictures look good on various sizes of displays with various resolutions. They use different technologies, different coding, and different standards, which means any given one may or may not work on any given rev of any given browser.

In general, how can the issue be handled? Well, you could have high-res pictures, have the browser download those, and then display them at lower resolution if need be. Or you could use one of the image formats where the picture starts off blurry and then sharpens. If all the sharpness isn't needed, the browser could simply stop the process. Or you could download various versions of the same picture and then display the one that makes most sense for a given resolution. One thorny issue is that you don't want to download a high res picture when all you need is a much lower res version. That'd be bad for bandwidth and loading speed. You also don't want to first load the webpage and then have it sit there with none of the pictures loaded while the software decides which version of a picture it should load, or while it's milling to get the picture into the optimal resolution. It's a surprisingly difficult issue.

After having read a good many articles on the issue, I was about to give up because all approaches seemed too complex to make sense for us to pursue.

But then I came across a solution that made sense. It's the "srcset" attribute that can be used with the standard HTML code for displaying an image. The way this goes is that you tell the browser to display a picture the way it always does. But the srcset attribute now also says that if the screen of the device the picture is viewed on has such and such resolution, or is so and so many pixels wide, then use the higher resolution version of the picture! That sounds a bit difficult, but it's made easier by the fact that modern browsers know whether they run on a "1x" screen (i.e. a good old-fashioned standard-resolution display) or a "2x" screen (like, for example, the Retina 27-inch iMac), or even a "3x" display (like a smartphone with insane resolution). Which means the browser only has to download one image, and it'll be the one that looks best on that particular display. Yeah!
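
To make that concrete, here's a minimal sketch of such a srcset declaration. The file names are hypothetical, made up for illustration; they're not from the actual site:

```html
<!-- Hypothetical example: the browser compares each "x" descriptor
     against the screen's device-pixel ratio and downloads only the
     best-matching version of the image. -->
<img src="tablet-900.jpg"
     srcset="tablet-900.jpg 1x,
             tablet-1800.jpg 2x,
             tablet-2700.jpg 3x"
     width="900"
     alt="Rugged tablet, interior view">
```

On a conventional display the browser fetches the 1x file; on a "retina"-class screen it fetches the 2x version instead, with no scripting involved.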

There's one problem, though. And it's one that can be frustratingly difficult to solve. It has to do with picture size. Anyone who is familiar with modern compact or DSLR cameras knows that there is a huge difference in the size of a "raw" image and one that's saved in JPEG format. And also that pictures can be saved at various quality levels in the JPEG format. For most web display situations, the art of the deal is to compress as much as you can while still having a decent looking picture.

How does that affect having various versions of the same picture for different types of displays? Well, if a standard picture takes 50kb of storage space, a "2x" picture will take four times as much, 200kb. And a "4x" picture would weigh in at 16 times as much, or a hefty 800kb. It's easy to see that this can result in serious bandwidth munching and sluggish page loading.
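
The square-law growth above can be sketched in a few lines. Note that the 50kb baseline is just the illustrative figure used here, not a measured value:

```python
def scaled_jpeg_kb(base_kb, factor):
    """Estimate storage for a scaled-up image: pixel count (and thus
    roughly JPEG size at equal quality) grows with the square of the
    linear scale factor, since a "2x" image doubles width AND height."""
    return base_kb * factor ** 2

base = 50  # illustrative size of a "1x" JPEG, in kilobytes
print(scaled_jpeg_kb(base, 2))  # "2x" image: 200 kb
print(scaled_jpeg_kb(base, 4))  # "4x" image: 800 kb
```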

Turns out, the human eye is very easily fooled, and so we can cut some corners, but it has to be the right corners. Trial and error revealed that with our RuggedPCReview site, saving a "2x" size JPEG in what Photoshop considers "low" quality at a "10" level takes roughly the same amount of storage as saving the same picture in a smaller "1x" size at "good" quality, which is Photoshop's "60" level. But doesn't the picture look terrible saved with such high compression? Amazingly, no. It looks great, and much sharper than the higher quality low res picture. That means that while we still must create two versions of each picture, loading a page with a high-res picture on a high-res display takes no longer than loading the low-res picture!

That sounded too good to be true, but we tried it and it works. So from now on, whenever possible, all pictures used on new RuggedPCReview pages will have low-res and high-res versions.

Isn't technology great!?


Posted by conradb212 at 06:26 PM | Comments (0)

February 23, 2016

Cat S60 — More than the naked eye can see

They used to say, and likely still do, that a picture is worth a thousand words. That's certainly true, but it can also be quite misleading as pictures often tell a story rather than the story. There can be quite a difference between these two. The media is very adept at using carefully chosen pictures that tell a story that may or may not be so, or present the story with a slant or an agenda. One could almost say that a picture can tell a story in a thousand different ways. And in the age of Photoshop (generically used here; any image program can do), that's more true than ever. And let's not even talk about videos.

There is, however, another aspect of pictures. Most only tell what the human eye can see. Light is electromagnetic radiation, and only a small part of it is visible to the human eye as colors. It's the part with wavelengths of roughly 380 to 750 nanometers. Below that are ultraviolet and then X-rays and other rays. Above that come first infrared, then microwaves and radio waves.

I've always been interested in the spectrum beyond what we can see.

My initial degree was in architecture, and that meant understanding the principles of heating and cooling as well as energy conservation. While we humans primarily feel temperature, temperature can also be seen. Technologies that make infrared wavelengths visible to the human eye can show us temperatures as colors.

As an enthusiastic scuba diver I learned the ways light behaves underwater. Colors are different underwater because waves travel differently, and some wavelengths are filtered out by water sooner than others. Lower energy waves are absorbed first, so red disappears when a diver reaches a depth of about 20 feet. Orange disappears next, at around 50 feet. Then yellow at about 100. Green stays longer and blue the longest, which is why things look bluer the deeper you go. But it's not always like that. For example, I found that sometimes I could see red at depths where it was not supposed to be visible. I wrote about that in Red at Depth.

The image below shows the same coral head at a depth of about 90 feet without artificial light on the left, and with the flash on the right. Without the flash, the red on the left ought not to be visible at all. And yet it is.

Then there's the interesting phenomenon of fluorescence. Fluorescence essentially describes the physical process of a substance absorbing light at the incoming wavelength, and re-emitting it at a different wavelength. Almost everyone has seen the ghostly effects of "black light" that makes some materials glow brightly while leaving others unaffected. Through scuba I found that there's a totally fascinating world of fluorescence under the sea. I described that in Night Dives Like you've Never Experienced Before.

The image below shows an anemone we photographed with a yellow filter and a NightSea fluorescent protein flashlight. In normal light you'd barely see the animal, but it is strongly fluorescent and lights up under certain incoming wavelengths.

Having founded Digital Camera Magazine in 1998 gave me an opportunity to witness the progression of digital imaging and also the opportunity of hands-on with a large number of different cameras and lenses. That knowledge of, and experience with, not only cameras but also the underlying imaging technologies led to an interest in emerging uses and applications. That included explorations in ultra-slow motion imaging for science projects with my son, and examining the emerging action photography and videography market (it's safe to say that I helped GoPro understand how light behaves underwater; see The GoPro phenomenon: what the world-beating little 1080p vidcam can (and cannot) do).

Below you can see me testing different color filters in a Northern Florida spring with a specially built rig with two GoPro Hero 3 cameras.

In my work as publisher of RuggedPCReview and before that Editor-in-Chief of Pen Computing Magazine, I came to appreciate thermal modeling and design in microelectronics where proper cooling and removal of heat generated by the processor and related circuitry is among the most important aspects of any mobile computing design.

That's where infrared imaging comes into play. Human eyes and normal cameras cannot see infrared. Older generations will remember that infrared, commonly referred to as IR, was widely used for data communication between computers and devices, and it is still used in many remote controls. Older audiences will also remember the "Predator" movies where those merry human-hunting aliens saw the world in infrared. Infrared, or thermographic, cameras have been available for decades, but at a high price.

Recently, a company named FLIR Systems changed all that with a series of much lower priced thermographic cameras as well as the FLIR One thermal imaging camera module that snaps onto an iPhone or Android device. Not having a review relationship with FLIR, I pre-ordered FLIR One for my iPhone 6 Plus, and it arrived late 2015.

The FLIR One is an absolutely revolutionary product in that it lowers the cost of very functional thermal imaging to around US$250. The way the FLIR One works is that it shoots both a thermal image and a conventional picture. Why the conventional picture? Because thermal imagery doesn't provide much physical detail and it can be difficult for our eyes, unaccustomed to thermal data, to interpret the image. So what the FLIR One does is extract line-drawing type of detail from the conventional image and then merges it with the thermal image. That makes it much easier to see exactly what the thermal data pertains to. The FLIR One does all of that automatically.

When you show an IR picture to an uninitiated person, they will almost always assume it's just a Photoshop filter applied to a regular picture. But that's definitely not so. The thermal camera records data readings and then displays those on a color scale. FLIR One users can select from various color schemes. Since most people associate blue with cold and red with hot, I usually use the blue-red scheme.
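
As a rough illustration of how such a color scale works, here's a sketch of a value-to-color mapping. The simple linear blue-to-red ramp is my own simplification; FLIR's actual palettes are more elaborate:

```python
def temp_to_rgb(reading, t_min, t_max):
    """Map a thermal reading onto a simple blue-to-red scale:
    the coldest reading renders pure blue, the hottest pure red."""
    # Normalize into 0..1, clamping readings outside the chosen range
    x = (reading - t_min) / (t_max - t_min)
    x = max(0.0, min(1.0, x))
    return (int(255 * x), 0, int(255 * (1 - x)))  # (R, G, B)

print(temp_to_rgb(0, 0, 100))    # coldest -> (0, 0, 255), pure blue
print(temp_to_rgb(100, 0, 100))  # hottest -> (255, 0, 0), pure red
```

Readings in between blend the two, which is why warm water shows up orange-to-red against an ice-blue background in the pond picture below.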

What can you use the FLIR One for? Well, the applications are almost endless. The architect in me began a thermal imaging review of my home, identifying the efficiency of insulation and the presence of leaks. The scuba diver in me donned full scuba gear and examined hot and cold spots as those can definitely be an issue on a deep cold-water dive. And the reviewer of advanced rugged mobile computing products in me, of course, instantly began examining the thermal properties of gear in our RuggedPCReview testing lab. It's fascinating to see the heat signature of a mobile computing device, how it changes as the device heats up, and get an overall idea of how well the designers and engineers handled the thermal aspects of their products.

Below are some examples of images taken with the FLIR One iPhone module. The image on the left shows a heater floating in an iced-over Koi pond. Dark blue is the ice on the surface of the pond, orange and yellow the warmer water kept ice-free by the heater. The image on the right shows the thermal design of a RuggON PX-501 tablet (see our full review). Yellow shows the heat-generating CPU and ancillary circuitry, as well as the copper heat conduit to the fan.

The two pictures below help architects, home owners and contractors. On the left, it's obvious which part of an attic room needs better insulation. On the right, you can literally see the cold air wafting in between two windows on an icy December day.

Is the FLIR One perfect? Not yet. While quadrupling thermal resolution over earlier low-cost efforts, it's still only 160 x 120, far, far less than the typical 8-16 megapixels recorded in the visible light spectrum. You don't need nearly as much resolution to convey useful thermal imagery (640 x 480 is considered high-res) and so the current low res is not a big problem. And now that FLIR has gained a foothold with the FLIR One and similar offerings, we'll likely see higher resolutions very soon.

But my story doesn't end here. In fact, you could say everything above is just an introduction to the news I had wanted to write about, the Cat S60.

Normally, we consider ourselves pretty well informed about anything in the rugged computing industry, but the Cat S60, officially introduced February 18, caught us by surprise. What is the Cat S60? Cat? Yes, Cat as in Caterpillar! Caterpillar actually sells a line of rugged smartphones on catphones.com. They all look suitably rugged, they all sport the familiar Cat logo, and their design language is reminiscent of Caterpillar's earthmoving machinery.

What immediately came to mind were branded signature editions of rugged hardware, like the "Hummer" version of one of the rugged Itronix notebooks several years ago. So does Caterpillar actually make smartphones? Not really. It turns out that the Caterpillar phones come courtesy of UK-based Bullitt Group, a privately held technology company that apparently works with a number of tech companies. From what I can tell, they either license famous brand names or work in joint ventures. As a result there are JCB phones (JCB is a British heavy equipment manufacturer), Kodak phones, and Bullitt has even licensed the Marconi brand from Ericsson to launch a range of radios named after the very inventor of radio. Bullitt does about $100 million in sales a year, not tremendous, but very respectable.

What's really interesting, though, is that the Cat S60 is not just a rugged smartphone built to benefit from the Caterpillar image and name. No, it's actually what Bullitt calls "the world's first thermal imaging smartphone," and it has a built-in FLIR imaging camera. So you get thermal imaging built right into your rugged smartphone. The Snapdragon 617 octa-core powered Android Marshmallow phone itself has a bright 540-nit 4.7-inch procap display that can handle wetness and gloves. Ruggedness specs are quite impressive, with a 6-foot drop rating and what appears to be IP68 sealing. The Cat S60 is said to be waterproof down to 17 feet for an hour, and its 13mp documentation camera can supposedly be used underwater (see Catphones media release).

Wow. That is impressive indeed. Having a rugged smartphone with integrated thermal imaging capability opens up entirely new applications and functionality in any number of areas. How is this possible? With FLIR's tiny Lepton longwave infrared sensor, which is smaller than a dime. For those interested in all the details, here's the full FLIR Lepton datasheet in PDF format. Resolution of the initial Lepton imager is limited to 80 x 60 pixels, the same as in FLIR's iPhone 5 camera module, and entirely adequate for thermal imaging on a small smartphone screen. How much does the Cat S60 cost? US$599. Which seems almost too good to be true.

This is all very exciting. I don't know what Caterpillar's reach in phones is, given that we had actually never heard of their phone operation. Then again, yesterday I had a meeting with my landscape architect, and he had not only heard of, but was very interested in, the new Cat S60. I can absolutely see how offering rugged handhelds or tablets with an integrated FLIR Lepton camera could be a strategic advantage, especially if bundled with thermal imaging demos and apps. And that kind of functionality would not only be of great interest to many customers, but would also be definite gold for marketing.

Posted by conradb212 at 03:28 PM | Comments (0)

February 15, 2016

Keeping an eye on the level of technology offered in consumer tech: Dell Venue 8

The consumer market is really, really tough. Sure, massive fortunes can be made off it thanks to its sheer size, and thus the potential of millions of units sold. But few products ever make it into that sales stratosphere, and the competition is brutal. Make one mistake, be it in technology, manufacturing, marketing or just about anywhere else, and the product tanks, expensively. Add to that the fickle taste of consumers, the unpredictability of trends, and a lightning-quick product cycle pace, and the true successes are few and far between, leaving some very good and often excellent products behind, or at least underappreciated.

That all came to mind as I spent the weekend playing with my latest impulse buy, a 7000 Series Dell Venue 8 tablet. First available in early 2015, the Venue 8 is an 8.4-inch consumer tablet that's part of Dell's efforts to establish itself in mobile technology. That effort had seen the short-lived Dell Venue smartphones and then a re-introduction of the line in late 2013 as tablets. Venue tablets are available in both Android and Microsoft Windows versions, the latter as the Venue Pro.

So why did I get a Venue tablet? In part because I've always liked Dell. I liked the old Dell Axim handhelds of the Pocket PC era, I like the various rugged Dell tablets and notebooks we've tested here at RuggedPCReview.com, and I liked Dell's decision to take the company private so as not to be at the mercy of Wall Street analysts whose quarterly growth expectations must be met lest they "worry" or, worse, become "concerned."

In this instance, I received an email alerting me to a special deal on the Venue 8. I decided to check it out, as the trusted Google Nexus 7 I'd been using as my personal small Android tablet, and also as a point of reference whenever we test a rugged Android device, had outlived its usefulness. After the latest OS upgrade — which to Google's credit was always quickly available for the Nexus 7 — the device had become so sluggish as to be useless. Could I not just ask one of our sponsors for a long-term loaner? I could, but I always like to have something that's truly mine and not subject to a sudden unexpected recall for inventory purposes or such.

Add to that that the deal was very sweet. Just US$199 for a 16GB Dell Venue 8 7840 running Android 5.1 (Lollipop). That's $200 off the regular US$399, shipping included. Fast shipping, too, as the package arrived at my doorstep just a day and a half after I ordered the tablet.

Now, since I am writing this for the RuggedPCReview.com blog, let me make something clear right off the bat: the Venue 8 is NOT a rugged device. It's of the standard consumer/business variety that Dell has always specialized in. So why the write-up if it's not a rugged device? Because it's always good to see what consumer technology is up to and what consumers expect, and get, for their money. With the massive global reach of smartphones and tablets, what consumers expect from their personal gear has a direct impact on what they expect from rugged gear. So there.

If that's the case, and according to our experience it is, then every manufacturer of rugged mobile computing gear should get a Venue 8 and study it. Because consumers get an awful lot of very advanced technology with this tablet, even at its US$399 list price.

First, there is the 8.4-inch display with massive 2,560 x 1,600 pixel resolution. That's 359 pixels per inch and incredibly sharp. It's an OLED (organic light emitting diode) screen that's also vivid with deep blacks and intense colors. The display has absolutely perfect viewing angles from any direction. Brightness ranges, depending on which review you want to believe, from 250 to 430 nits. I consider it very bright. The folks at AnandTech actually didn't like the display very much in their review of the Venue 8 (see here). But compared to most displays in rugged devices, it's very, very good.
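That 359 ppi figure follows directly from the resolution and the screen diagonal; here's the standard calculation as a quick Python sketch, using only the numbers quoted above:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Dell Venue 8: 2,560 x 1,600 pixels on an 8.4-inch panel
print(round(ppi(2560, 1600, 8.4)))  # 359
```

The same two-line formula works for any display, which makes it handy for comparing spec sheets.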

For a processor, the Venue 8 has a quad-core Intel Atom Z3580 with a maximum burst frequency of 2.33GHz. The Z3580 is part of the "Moorefield" lineup of Atom chips and designed specifically for smartphones and tablets. It's different from Bay Trail chips in that it's not using Intel HD Graphics, but a PowerVR Rogue G6430. There's 2GB of RAM and 16GB of eMMC mass storage, plus up to 64GB via externally accessible micro SD card. There's Bluetooth 4.0 and fast 802.11ac WiFi via an Intel 7260 WiFi + BT 4.0 module. And the 21 watt-hour battery is supposed to last 9.5 hours.
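As a rough sanity check on the battery claim, dividing capacity by runtime gives the implied average system draw (a back-of-the-envelope Python sketch using the figures quoted above):

```python
# Implied average power draw from the quoted battery spec
battery_wh = 21.0   # rated capacity, watt-hours
runtime_h = 9.5     # claimed battery life, hours

avg_watts = battery_wh / runtime_h
print(f"{avg_watts:.1f} W")  # 2.2 W average system draw
```

An average draw of around 2 watts is entirely plausible for a tablet of this class, so the 9.5-hour claim passes the smell test.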

The Venue 8 has not just two, but four cameras. Yes, four. There is the standard frontal conferencing cam (2mp), there is the 8mp rear-facing documentation camera, and then there are two supplementary 1mp cameras that work in conjunction with the 8mp camera to allow depth measurement, as well as adjusting focus on a picture after it's been taken. You don't see that anywhere else.


The whole thing is packaged into an anodized aluminum case that measures 8.5 x 4.9 inches and is less than a quarter of an inch thick. This tablet makes the sleek iPhone 6 look a bit stout. Weight is 10.75 ounces.

So how well does the Venue 8 work?

Very well. The Venue 8 has an exceptionally high quality feel to it, in that Apple-esque way that makes it seem as if the device were milled from a solid block of metal. And the OLED display is just gorgeous with its rich, vibrant colors and deep blacks. It's like looking at a particularly good plasma TV compared to a regular LCD TV. I/O is minimal. There's the tiny micro-USB jack for charging. There's the micro SD card caddy. A headphone jack. And then the small volume rocker and power switch.

What's a bit unusual is the asymmetrical layout, with 3/16th-of-an-inch bezels on three sides and a much heftier 1-3/16 inches on the fourth. That's good news for the speakers, which get much more real estate than they usually do in a small tablet, and they face forward for good sound. It also makes for a largish area by which to hold the tablet, but both the front and the rear main cameras are located there as well, which means it's easy to inadvertently cover them. Overall, I like the arrangement.

In terms of performance, the Venue 8 is quick. Few of the Android devices I've tested or worked with are as smooth as anything that comes out of Apple; a degree of stuttering here and there is common. There's very little of that on the Venue 8. It's generally quick and responsive, and a pleasure to use.

Literally billions are familiar with Android now, which means that whatever quirks Android has — and it still has its fair share of them — do not affect its popularity. The vast variety of Android hardware and the numerous versions of Android itself, however, still mean that some apps are either not available for a particular device or version, or they are not optimized for your particular device.

Some reviewers have complained about very small text and icons on the Venue 8 due to its very high resolution on a fairly small display. I did not find this to be an issue under Android Lollipop, and certainly much less of an issue than it is on most small Windows tablets.

The "depth" camera assembly, with its main 8mp camera flanked by two 1mp complementary cameras, has me baffled. The idea is that the two subsidiary cameras capture depth information that can then be used to do a variety of things to a picture, like focusing on certain areas, applying filters to certain areas, or even measuring distances and areas. It can also be used to widen or narrow the depth of field for artistic purposes.

Unfortunately, this is only marginally documented, and it didn't work all that well for me. On top of that, the 8mp camera itself isn't nearly as good as most people have come to expect from their smartphone cameras. So that's a disappointment. It does, however, still work better than most cameras in rugged systems. I understand the need to differentiate a product from the competition, but in this instance I'd have preferred one really excellent documentation camera instead of the triple camera experiment.

Battery life is excellent, especially considering there are only 21 watt-hours and the device is less than a quarter inch thick. Battery life is so good that, as with iPads, it simply ceases to be an issue.

So what does it all mean as far as rugged computing is concerned? Since it's not a rugged device, nothing directly. However, the Venue 8 demonstrates the level of technology and features that's available to consumers for very little money. And the impressively high quality. The vibrant OLED display with its very high resolution. All for US$399 list, or the ridiculously low US$199 on special.

And what does it mean as far as manufacturers of rugged tablets are concerned? Simply that the consumer market is spoiling consumers with advanced technology that's hard to match in low-volume ruggedized gear with much longer product cycles. So I don't expect to find all this enticing high technology in rugged computing products for the job. But it definitely IS good to keep an eye on what consumers are getting for very little money. Because those consumers then want the same in their professional gear.

Posted by conradb212 at 07:55 PM | Comments (0)

January 19, 2016

Follow-up on iPad Pro and Apple Pencil

I've now had the iPad Pro for a good couple of months and the Apple Pencil for a month and a half. How do I use them? Have they changed my life?

As far as the iPad Pro goes, it has totally replaced my iPad Air 2. I don't think I've used the Air 2 once since I got the Pro. However, I am doing the exact same things on the Pro that I used to do on the smaller Air 2. The split screen functionality is not good or compelling enough to really work with two apps at once, and it's nowhere near universally supported.

So I use the Pro just as a larger iPad. And although the Pro is significantly heavier than the Air 2, and almost a bit unwieldy, the bigger screen and the fact that it's a good bit quicker than the Air 2 are enough to keep me on the larger Pro.

I'm disappointed that there really are no apps that are truly "pro" in the sense that they add undeniable value to a larger device, make it a professional tool instead of just the device that the iPad always has been. For now, there really is no difference.

How about the Apple Pencil? As someone who has worked with and written about pen computing technology for over 20 years, I should be a primary candidate for the Apple Pencil. I should be thrilled that Apple is finally recognizing the pen as the important productivity tool it can be.

But I am not.

I played around a bit with the Apple Pencil when I first got it, but haven't used it since. That's not because I am no longer a fan of pens. It's because Apple just didn't get it right. The Apple Pencil is too large, too slippery, and too poorly supported. You never know if an app will really support it, or just part of what the Pencil can do.

And having a pen with a battery is just unappealing, especially when its primary charging mechanism, sticking the pen into the Pro's Lightning connector, is just too bizarre. As is, when I look at the Pencil and feel like trying it again, the first thing that comes to mind is that it probably needs charging first. And I move on.

Add to that the fact that there's no garage for the pen on the big Pro, and the $99 Pencil seems almost like an effort by Apple to emphasize Steve Jobs' point: we don't need a pen!

All this baffles me. I really wanted to like the Pencil. But the way Apple went about it is like the way Microsoft went about the Kinect: an expensive add-on that shows flashes of brilliance, but overall just doesn't work well enough for people to want it.

Posted by conradb212 at 10:04 PM | Comments (0)

December 06, 2015

An assessment of the Apple Pencil

A few weeks after the Apple iPad Pro began shipping, the Apple Pencil is now available also. This is big news because it was Apple who finally made the tablet form factor a success, and they did it without a pen. Which is remarkable, as tablet computers initially were conceived specifically as a modern equivalent of a standard notepad that you wrote on with a pen. And remarkable again as Steve Jobs was adamantly opposed to pens and often listed his reasons why he felt that way.

But now the Apple Pencil is here, a good 5-1/2 years after the iPad was first introduced. Why did Apple do it? Perhaps it's because Samsung has been selling tens of millions of their Note tablets with pens. Perhaps it's because being able to do quick illustrations or annotate text documents with handwritten notes simply makes too much sense to ignore. Perhaps it's a tacit acknowledgment that fingerpainting is limiting when it comes to anything but the simplest of artistic expression. Perhaps it's because the big iPad Pro simply needed something to differentiate itself from smaller and lesser iPads. Or perhaps it's all of the above, or something else entirely. Fact is, the Apple Pencil is here.

The big question now is how, and how well, the Apple Pencil works, and what it might mean for Apple. After all, the pen is what Jobs felt was so unnecessary.

A brief history of using pens with tablets

But before I go into my own opinions of and experiences with the Apple Pencil, I want to outline the big picture. As stated above, tablets were initially conceived as a modern-day replacement for pen and paper. And they've been around for over a quarter of a century. Contrary to what a lot of people believe, Microsoft did not invent the tablet in the early 2000s. Already back in 1989, the Momenta tablet was available, and it sparked great excitement over a future where tablet computers you could write on with a pen would replace the conventional PC.

In the early 1990s, every major computer company offered a tablet. At a company named GRiD, Jeff Hawkins (who would later invent the Palm Pilot) designed the GRiDPAD and the PalmPad. NCR had the NotePad, Samsung the PenMaster (see below), Toshiba the DynaPad, IBM the ThinkPad (1993 IBM ThinkPad 730t shown on the right), and there were many more, all tablets operated with a pen. Many of those tablets initially ran the novel PenPoint operating system specifically designed for tablets and use with a pen.

Unfortunately, while there was tremendous hype around those early 1990s tablets, they failed to become a commercial success for a variety of reasons. One was that the technology just wasn't there yet to create a tablet light and functional enough to be of much use. More importantly, supporters of the tablet concept back then took the idea of electronic pen and notepad too literally. Their central thought was that everyone knows how to use a pencil, and everyone knows how to write. So we simply use computing power to translate handwriting into text. The tapping, panning, pinching and zooming of modern tablets simply never entered the equation because a) it wasn't possible back then, and b) people didn't do that on pen and paper pads, so why do it on a computer?

As a result, early 90s tablets completely relied on pens. If the pen failed to work or was lost, there was no touch functionality as a backup. The tablet became useless. Likewise, if the computer failed to understand the handwriting or gestures written on it, which was often the case, it was useless.

That quickly dampened the enthusiasm for tablets, and the fact that Microsoft fought against the new pen-centric operating systems tooth and nail didn't help either. It was a war Microsoft won. Its token-effort "Windows for Pen Computing" prevailed against the far more innovative PenPoint OS.


That did not mean Microsoft had no interest in pens. After the Apple Newton failed in the mid-90s due to its exclusive reliance on handwriting recognition, Microsoft's own Windows CE achieved modest success on small handhelds that primarily used the pen as a mouse replacement. Microsoft then followed up with its 2001/2002 Tablet PC initiative (NEC LitePad shown on the right) that used active pens with significant built-in support from the Windows XP Tablet PC Edition. Handwriting recognition and gestures were available, but hardly played a role at all.

The Microsoft-spec Tablet PC failed because, again, it used the pen simply as a comparatively clumsy mouse replacement on Windows, an OS completely designed for use with a mouse. Plus, it was all too easy to lose the (still expensive) pen, and there was no intuitive finger tapping, panning, pinching and zooming as a backup and complement for the pen. Small wonder that Microsoft itself switched its emphasis to convertible notebooks instead of pure tablets before the "Tablet PC" was even officially launched.

Apologies for the long history here, but it's necessary to understand all this when assessing the Apple Pencil. Has Apple learned from history, or will the Pencil fail because it's making the same mistakes all over?

Has Apple learned from pen history?

So given all that, given that Apple showed the remarkably prescient "Knowledge Navigator" almost 30 years ago, given that Apple had the Newton over 20 years ago, and given that Apple has all the engineering and human interface design expertise in the world, how did Apple implement the new Apple Pencil for the iPad Pro? And what kind of technology did they use to make it work?

The first thing you notice about the Apple Pencil is that it's very long. A full seven inches. Most writing implements are about five inches long although, in fairness, an actual lead pencil is also about seven inches long. Still, it's not clear to me why Apple made the Pencil that long.

Most likely the space is needed for the electronics inside. Apple is very good at miniaturizing everything, but the Apple Pencil is a remarkably complex piece of equipment. The folks at iFixit tore one down (see here) and found not only a fairly long rechargeable (but non-replaceable) 0.33 watt-hour Li-Ion battery, but also a full logic board with an ARM Cortex-M3 processor, a Bluetooth chip, a variety of sensors and more. There's an awful lot of tech in there, and so for now Apple perhaps simply needed that much space.

Yes, there's a battery. Which means the Apple Pencil must be charged. This is in complete contrast to the active pen technology that has ruled supreme since the early 1990s: Wacom. Slender, lightweight Wacom active pens have been around since the very first pen tablets, and they've beaten all competing technologies over the past 20+ years. Microsoft pretty much built its 2002 Tablet PC around the Wacom pen. The image below shows the Apple Pencil and the Wacom pen used in a 2003 Toshiba convertible tablet.

Wacom's success was, for the most part, because the Wacom pen does not need a battery; every other active pen technology does. Microsoft's N-Trig pens used with its Surface tablets need a battery. Since the battery in the Apple Pencil is non-replaceable, how is it charged? Amazingly and almost unbelievably, this way:

Can anyone say "recipe for disaster?" This has got to be one of Apple's worst ideas ever. It's in the "What were they THINKING?" category. The combination of a fragile Lightning connector and a seven inch lever sticking out from it is virtually guaranteed to eventually result in damage.

Fortunately, the Apple Pencil comes with a tiny (and thus eminently losable) adapter so you can also charge it via a standard Lightning-to-USB cable.

Now how about two of the other big problems with pens in the past: cost, and losing the pen? The news is not good here. At US$99, the Apple Pencil costs more than a low-cost tablet, and it's certainly not a throw-away item. It's also slippery, doesn't have a clip to attach to a coat pocket (for which it'd be too long anyway), and thanks to the super-slender design of the iPad Pro, there's no garage in the tablet, nor any other way to attach the pen to it. Apple should have come up with some sort of solution for that.

Apple Pencil technology

Now what about the technology? Wacom technology works with a powered digitizer grid that senses the proximity and location of the pen behind the LCD. Resistive pens need actual touch. Capacitive pens in essence emulate a fingertip, which changes the capacitance between two electrodes. Experts I have talked to long thought that a combination of Wacom (or another active pen technology) and capacitive touch would be the ultimate answer to adding fine-detail movement to capacitive multi-touch. But some have recently changed their opinion and now see advanced capacitive pens as the likely winner. An example of the latter technology is the superb capacitive pens that come with some Panasonic tablets.

Apple, however, apparently took a different approach. The Apple Pencil communicates with the tablet via Bluetooth. Which means Bluetooth must be on, with all that that entails. How exactly the Pencil works I don't know yet. I've seen sources that say the Pencil's technology is such that the iPad knows whether a touch is from the Pencil or from a finger, therefore making it possible to process Pencil touch in very different ways. Add to that the sensors present in the Pencil, and things like the much heralded "shading" become possible when the Pencil is held at an angle.

Clearly, wherever there is much tech involved, amazing things can be done, and the Apple Pencil is headed that way. But amazing things generally need a lot of support, and that support will take a while to materialize. As is, the Apple Pencil can be used to tap and pan anywhere, and it works in virtually all applications that accept inking. But the sexy shading requires special support, and for now it's a trial and error process figuring out which apps and tools support what.

If you search for "Apple Pencil Support" in the app store, nothing comes up. If you look for "Apple Pencil apps," a large number show up, but it's not clear how and to what degree the Pencil is supported.

How well does the Apple Pencil work?

How well does the Apple Pencil work in general?

Quite well. Playing with various drawing and sketching apps and exploring the tools with the Apple Pencil is a pleasure. Ink goes on smoothly and elegantly, and it's easy to see what the Pencil can do in the hands of an artist who invests the time to learn all aspects of all the tools. We're certain to see some amazing artwork created with the Apple Pencil.

However, pretty much anything I managed to do with the Apple Pencil I've been able to do for many years with a 20-year-old Wacom pen. Using a Wacom pen on a Microsoft-style Tablet PC delivers wonderfully smooth ink and virtually no lag, same as the Apple Pencil. Those older pens, too, have erasers and pressure sensitivity, and they can do amazing tricks. So for now I cannot exactly view the Apple Pencil as a stunning leap forward.

The Apple Pencil hasn't resolved some old pen technology issues. Even Apple claims "virtually" no lag on its own promos, as there still is a bit of lag. The human mind has a very firm expectation of real-time versus lagging response (think of how weird even a slight lip-sync delay is in video and movies), and with rapid movement of the Pencil, there's definitely a sense of delay.

Worse, for me, is the on-off delay in very small movements, like when an artist wants to add minute touches here and there. In the apps I tried that with, there often was a slight delay before the touch registered. What that means is that with a real pencil you absolutely always know what to expect. For now, with the Apple Pencil, not so much.

Finally -- and that's another area where the pen-and-paper metaphor breaks down -- when we write on paper with a pencil, we expect the relatively coarse feel of paper. When we pan and swipe on a modern tablet we want as little "stiction" as possible for best operation. Alas -- you can't have both. As is, between the thickness of the glass on the tablet and its super-smooth feel, working with the Pencil feels like writing on a piece of glass. The brain doesn't like that. The brain expects the friction. Likewise, when our eyes watch the Pencil operate from an angle, the brain says, "Wait a minute! I can feel the Pencil touch the surface, but the writing is a fraction of an inch away from the tip!" That's parallax, of course.
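For the curious, the parallax the brain objects to can be roughly estimated with basic trigonometry. This is a simplified sketch only: the ~1 mm cover glass thickness is an assumed, illustrative value, and refraction within the glass (which reduces the apparent offset) is ignored.

```python
import math

# Simplified parallax estimate: apparent offset between pen tip and ink when
# viewing at an angle through cover glass of thickness t (offset = t * tan(a)).
# The 1 mm thickness is a hypothetical value chosen for illustration.
glass_mm = 1.0
for angle_deg in (15, 30, 45, 60):
    offset_mm = glass_mm * math.tan(math.radians(angle_deg))
    print(f"{angle_deg:2d} deg: ~{offset_mm:.2f} mm")
```

Even under these simplified assumptions, viewing at a steep angle puts the ink well over a millimeter from the tip, which is exactly the "fraction of an inch" mismatch the eye picks up on.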

Handwriting recognition

It's hard to imagine that there was a time when there was fierce competition in the handwriting recognition software market, with various approaches and technologies seeking to address all sorts of writing needs. At one point I had nearly a dozen different recognizers working on a Compaq Concerto tablet convertible that served as my main mobile system for a while in the mid-90s. Today, recognition is essentially a non-factor, what with so many more people being keyboard-savvy and onscreen keyboards working so well.

But handwriting recognition software is still available. In fact, the roots of my favorite app, PhatWare's WritePad, go way, way back to the beginnings of handwriting recognition, and the technology has been refined ever since. It works beautifully on the iPad Pro.

Whether or not handwriting recognition will make a comeback is anyone's guess. Between the maturity of the software and the vastly improved capabilities of hardware it certainly works very well. One thing I found distracting is the clicking noise that the hard pen tip on the glass surface of the iPad Pro makes when handwriting. That was never an issue with the soft display surface of the Apple Newton, but such are the pros and cons of various technologies.

Apple Pencil: Too early to tell

Having worked with pens and tablets for 25 years, I want to like the Apple Pencil. I do believe a pen can greatly add to the iPad experience. But for now, and that is so highly unlike anything from Apple, what comes to mind is "the good, the bad, and the ugly." I love how well the Pencil works and the promise it shows. I am not fond of the price, the length, the ease of losing it, the battery, the ungainly looking tip and the uneven app support. And I am baffled that Apple thinks sticking the Pencil into an iPad port to charge it is even remotely a good idea.

So for me, the jury on the Apple Pencil is still out. But I am certainly glad Apple made it available. -- Conrad H. Blickenstorfer, December 2015

Posted by conradb212 at 06:36 PM | Comments (0)

November 20, 2015

Will the Apple iPad Pro herald an era of "pro" use of tablets?

My iPad Pro came in and I want to share my first impressions, and also my thoughts on where all this is leading.

Anyone who first sees the iPad Pro will immediately notice its size. The iPad Pro is big. 12 x 8.7 inches versus 9.4 x 6.6 inches for the iPad Air 2. So the iPad Pro's footprint is 68% larger than that of the standard iPad. Amazingly though, at 0.27 inches the iPad Pro is barely thicker than the iPad Air 2, which is 0.24 inches thick. And even more amazingly, the big iPad Pro is actually thinner than the sleek and slender iPhone 6S Plus (0.29 inches). In terms of weight, the iPad Pro comes in at a very manageable 1.57 pounds, just barely more than the original iPad, which weighed a pound and a half.
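The 68% figure follows directly from the quoted dimensions; a quick Python check:

```python
# Footprint comparison from the dimensions quoted above
pro_sq_in = 12.0 * 8.7   # iPad Pro: 104.4 square inches
air_sq_in = 9.4 * 6.6    # iPad Air 2: 62.04 square inches

pct_larger = (pro_sq_in / air_sq_in - 1) * 100
print(round(pct_larger))  # 68
```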


That said, its sheer size remains the iPad Pro's most obvious feature. The picture below shows the (already rather large) 5.5-inch iPhone 6 Plus, the 9.7-inch iPad Air 2, and then the new iPad Pro with a diagonal screen size of 12.9 inches.

What about some of the other display specs? The 5.5-inch iPhone 6S Plus has 1,920 x 1,080 pixel resolution, which means 401 pixels per inch (ppi). The 9.7-inch iPad Air 2 weighs in at 2,048 x 1,536 pixels, for 264 ppi. And the new 12.9-inch iPad Pro sports 2,732 x 2,048 pixels for the same 264 ppi. Super-sharp, all of them. And how bright are the displays? The iPhone 6S Plus is around 500 nits, the two iPads both around 450 nits. Plenty even for occasional outdoor use.

Setting up the iPad Pro was as simple as it gets. I backed up my iPad Air 2 to my iMac, then went through the setup process for the iPad Pro and selected "restore from backup." That took less than 20 minutes. Then the iPad Pro came up looking just like my iPad Air 2. But not quite: only about two-thirds of my apps were loaded; the rest I had to restore from the App Store myself. And whatever required a password on the existing iPad required the password again on the iPad Pro. Otherwise, everything was there. Kindle opened the book I was reading on the same page, and every webpage I'd had open on the iPad Air was also open on the iPad Pro when I first launched Safari. This ease of upgrading is something I've always loved about Apple.

I had feared that some apps might look terrible on the larger screen, the way iPhone apps looked terrible on the original iPad, which had not only a much larger screen but also much higher resolution. Since the pixel density of the iPad Air 2 and iPad Pro is the same, nothing looks unusual, but many apps do look somewhat too spread out on the much larger screen.

You do get used to the larger display size very, very quickly. After a couple of hours of checking out apps and just doing my regular iPad routine, the Pro, while still feeling unquestionably large, felt right, whereas the iPad Air suddenly seemed to have shrunk.

But does one really need such a large screen? As is and for now, it seems like a luxury. The vast wealth of existing apps has been designed and optimized for the 9.7-inch iPad display, so it's not as if using the smaller screen was ever an inconvenience. Not in the way running classic Windows, designed for a big desktop display, can be a challenge on a small mobile device screen.

Where the larger screen will certainly come in handy is in split screen operation. Splitting the screen on the 9.7-inch iPad made for a lot of eye squinting and panning around. On the iPad Pro, working with two apps feels almost like having two iPads side-by-side. The picture below shows the iPad Pro in landscape mode with Excel and Safari side-by-side. Each half is plenty large enough for comfortable viewing.

Problem is, there aren't many apps that support the full split screen yet. By full I mean 50-50, instead of just the 2/3-1/3 split (much like the one Microsoft introduced with Windows 8) that many apps are limited to. And sifting through an alphabetically ordered column of apps to pick the one you want in a split part of the screen is hardly the best way to get things done.

Another issue is that apart from the bigger size, there isn't really anything "pro" about the iPad Pro yet. I searched for "iPad Pro" in the App Store, but couldn't find anything that truly seemed to be made for the iPad Pro, or to take advantage of it. Not yet anyway.

One truly weird thing is that splitting a screen into two is sold and lauded as a marvelous technological advance. What?? For the past 30 years we've had as many resizable windows to work with as we wanted and needed, and now splitting the screen into two is news? No way.

There is, of course, the pen. Sadly, I'll have to wait for mine another three weeks or so, although I ordered it together with my iPad Pro. I did get a chance to play with it in the local Apple store. There's good news and not so good news.

The not-so-good news is much the same that sank the first generation of pen computers in the early 90s and limited adoption of the second generation in the early 00s. The Apple pen (Apple calls it "Pencil") is very expensive (US$99), very large (bigger than a full-size pencil and with an oddly fat tip), and very easy to lose (it's slippery, has no clip, and there's no garage for it on the iPad). Worse, it needs a battery, which here means recharging it via the iPad's port where it looks like it could very easily get damaged. And anything that must be charged will inevitably come up dead when you need it most.

All of the above, and a few other reasons, is why Steve Jobs was adamantly opposed to pens.

The good news, though, is that the pen works very well. Though the one I tried had somehow disconnected itself from its iPad and needed genius intervention, once the pen and tablet talked, the pen worked as smoothly and effortlessly as can be. Pressure results in thicker or thinner lines (hence, probably, "pencil"), the pen never falls behind, and the much advertised shading when you hold the pen at an angle is marvelous indeed. If all of this sounds like the Apple Pencil is mostly for artists, that may or may not be so. I still love to jot quick notes on whatever scrap of paper is around, and with the right app scribbling away on the iPad Pro could be a boon. The question then is whether that's enough of a use to justify a thousand-dollar tablet and a hundred-dollar pen.
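
Apple doesn't publish how Pencil pressure maps to line thickness, but the general idea behind pressure-sensitive strokes is simple enough to sketch. This toy function, with invented names and ranges, assumes a pressure reading already normalized to 0.0 through 1.0:

```python
def stroke_width(pressure, min_width=0.5, max_width=8.0):
    """Map a normalized pen pressure (0.0-1.0) to a stroke width in points.
    Readings are clamped so a noisy sensor can't produce negative or
    absurdly fat lines."""
    p = max(0.0, min(1.0, pressure))
    return min_width + p * (max_width - min_width)

print(stroke_width(0.0))  # lightest touch: hairline (0.5)
print(stroke_width(1.0))  # full pressure: thick stroke (8.0)
```

A real drawing app would also smooth the pressure samples over time, but a mapping like this is the core of the "thicker or thinner lines" behavior.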

Few remember today that handwriting recognition was the central concept of the first generation of pen computers. The idea was that everyone knows how to write, whereas not that many could type very well. So let's replace that clunky old keyboard where you have to learn the layout of the keys with a simple pen. The computer then recognizes the writing, and no more need for a keyboard, not even onscreen. Sadly, that never really worked out. It worked for a few (including me), but no computer was fast and smart enough to make sense of the kind of careless scribbling most of us commit to paper. And editing via pen was a pain, too, almost as much as editing via voice (which is why pure voice recognition also isn't winning any popularity contests).

But might useful handwriting recognition be a reason to have the Apple Pencil? That's quite possible. Apple owns excellent intellectual property in the form of the "Rosetta" recognition technology of the late Newton MessagePad that became available as "Inkwell" in the Mac OS. Whether or not this will amount to anything on the iPad with its quick and easy tap-editing is anyone's guess.

Final question: what impact will all of this have on rugged tablets? After all, Apple will likely never make a rugged iPad, and although many rugged cases are available for iPads, more than likely Windows and Android will remain the prevalent OS platforms in rugged computing devices.

The answer, I think, is that anything relating to screen size is worth watching. Trends towards smaller and larger display sizes in certain classes of mobile devices have always been quests for strategic advantage as much or more than technological progress (try to sell a phone with a 3-inch screen today!). And with both Microsoft and Apple (let alone Samsung) now using pens, pens may well regain a much more prominent position in mobile devices.

In closing this article, let's not forget that we're still very much in the midst of a software interface quagmire. Most of today's productivity software was created for use with a mouse on a desktop. Yet, it's a mobile world now where touch has replaced the mouse. Unfortunately, while panning, tapping, pinching and zooming work great in media consumption apps, they lack the precision mouse-era productivity software still requires. And that, my friends, is where the word "pro" ought to come into play. It's not so much the size of the screen as it is how to work on it. And that issue remains to be resolved.

Posted by conradb212 at 04:52 PM | Comments (0)

September 18, 2015

What led to the Universal Stylus Initiative

A short while ago I received a press release from the Universal Stylus Initiative. I filed that away in my mind, but got back to it because the concept certainly sounds interesting. Having used, described, tested and compared numerous pen and touch technologies over the past two decades in my work first at Pen Computing Magazine and then at RuggedPCReview, I definitely consider it a relevant and increasingly timely topic (witness Apple's announcement of the iPad Pro with a pen!).

So I spent some time thinking things through and figuring out the need for a universal stylus initiative.

The great appeal of tablets and smartphones, of course, is that they provide tremendous communication and computing capability in small and handy packages that can be taken anywhere. That's in part possible because they don't have physical keyboards that add weight and get in the way. The trade-off, of course, is that without a physical keyboard and a mouse it isn't as easy to enter a lot of data or easily operate the computer.

That's where touch and pens come in. Early tablets (yes, there were tablets in the early 1990s) were called pen computers because that's how they were operated, with active pens. There was touch, too, but it was primarily of the resistive variety that worked best with a pointy stylus. That technology saw its heyday when phones were still dumb and people kept their address books and calendars first on Apple Newtons and then on Palms and Pocket PCs.

When Microsoft became interested again in tablets around 2001/2002 (I say "again" because they'd been interested a decade earlier, but primarily to fend off pen-based rivals to Windows) they built the "Tablet PC" around active pen technology. It's called "active" technology because a sensor board behind the LCD detects the precise position of the tip of the pen even when the pen does not actually touch the glass. That's different from "passive" touch technology where a touch is only registered when a finger or stylus touches or depresses the LCD surface.

What are the inherent advantages and disadvantages of active versus passive?

First, active pens make "hovering" possible: a cursor can follow the pen without actually registering a touch, so the user always knows where the tablet sees the pen. That allows for very precise operation, just as seeing the cursor does when one operates a mouse. Second, active pens can be pressure sensitive. That can be used for 3D-like operation, and is invaluable for artists and designers. Third, active pens can have very high resolution, which makes them quick and very precise, something that's increasingly important on today's super-high resolution displays. On the negative side, active pen technology is fairly expensive. It can be inconvenient to have to first locate the pen before the tablet is operational. And if the pen gets lost, the device may become unusable.

And what about the pros and cons of passive touch technology?

The good thing is that conventional resistive touch doesn't need a special pen. Any cheap stylus will do, as will a fingernail and even firm finger touch. Resistive touch is also fairly precise as long as it's used with a stylus, and it's totally immune to rain or any wetness. For that reason alone, many rugged tablets and handheld computers have been using resistive touch for many years, and are still using it. But passive resistive touch has some significant disadvantages as well. Finger touch alone is very imprecise and unsuitable for operating small user interface components such as scrollers, check boxes and the like. Even when using a passive stylus, there's no cursor to tell you where exactly the touch will be registered. And there's the issue of "palm rejection," i.e. making sure that the device reacts only to the stylus and not to inadvertent contact from the palm of the user's hand.
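
Actual palm-rejection algorithms are proprietary, but a common heuristic is simply to discard contacts whose reported touch area is far larger than a fingertip or stylus tip. Here's a toy illustration of that idea; the threshold and data layout are invented for the example:

```python
def filter_palm_contacts(contacts, max_area_mm2=120.0):
    """Keep only contacts small enough to be a fingertip or stylus tip.
    A resting palm typically reports a much larger contact area."""
    return [c for c in contacts if c["area_mm2"] <= max_area_mm2]

touches = [
    {"id": 1, "area_mm2": 45.0},   # fingertip
    {"id": 2, "area_mm2": 900.0},  # resting palm, rejected
    {"id": 3, "area_mm2": 8.0},    # stylus tip
]
print([c["id"] for c in filter_palm_contacts(touches)])  # [1, 3]
```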

The above was roughly the status quo until Apple popularized projected capacitive multi-touch with the iPhone. Procap, or p-cap, as it's commonly referred to, is still passive touch. But it's a far more refined and much more elegant type of passive touch. Instead of pushing down hard enough to register a "touch," p-cap works via "mutual capacitance," i.e. the decrease in capacitance between a sensor electrode and a drive electrode when a finger gets close enough to affect (syphon off, really) the normal capacitance between the pair. This technology only requires a very soft touch, and it's quite precise once a user gets the hang of it. It's also quick because it's electronic rather than physical, and p-cap can easily recognize more than one touch at a time. Apple leveraged all of these strengths to enable the effortless tapping, panning, pinching and zooming that not only made the iPhone a game changer, but also made p-cap the touch interface of choice for virtually all tablets and handhelds.
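
The detection principle described above can be sketched in a few lines: scan every drive/sense electrode pair and flag the cells whose measured capacitance has dropped noticeably below a no-touch baseline. This is only a conceptual illustration; real touch controllers add heavy filtering, and the values here are invented:

```python
def detect_touches(baseline, measured, threshold=0.15):
    """Return (row, col) cells where mutual capacitance dropped by more than
    `threshold` (as a fraction) below the untouched baseline. A nearby
    finger siphons off charge, lowering the reading at that intersection."""
    hits = []
    for r, (b_row, m_row) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            if (b - m) / b > threshold:
                hits.append((r, c))
    return hits

baseline = [[1.00, 1.00, 1.00],
            [1.00, 1.00, 1.00]]
measured = [[0.99, 0.70, 0.98],  # finger over row 0, column 1
            [1.00, 0.97, 0.99]]
print(detect_touches(baseline, measured))  # [(0, 1)]
```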

However, even the wonderful p-cap technology has its disadvantages. First, the subtle change in capacitance between two electrodes when a finger touches them requires a dry surface. Water, with its great conductivity, tricks the electrodes into false readings. Second, since p-cap doesn't facilitate "hovering" and the finger touch area is fairly large, p-cap operation isn't nearly as precise as that with a mouse or an active pen. Neither of those disadvantages was severe enough to keep p-cap from becoming the success it is. They were, however, the primary reason why some tablets and even phablets became available with active pens. And that even though the late Steve Jobs was adamantly opposed to pens.

There is, unfortunately, no getting around the fact that legacy Windows doesn't work well with p-cap. One result of that recognition was Microsoft's bafflingly unfortunate Windows 8, which imposed not-ready-for-primetime touch functionality on the hundreds of millions of people using legacy Windows software on the job. Another was that, Jobs' decree notwithstanding, tens of millions bought Samsung's Galaxy Note tablets, which combined p-cap with a little Wacom pen, adding precision when needed, as well as a handy tool to jot and draw and doodle.

How did all of this affect the industrial and vertical mobile computing markets we cover here at RuggedPCReview? In a number of ways.

While p-cap totally took over on consumer smartphones, it took years for rugged handhelds to switch from resistive touch to p-cap. That's for two reasons.

First, Microsoft simply didn't provide an upgrade path for Windows Mobile, its mini-OS that had dominated industrial handhelds for many years. The p-cap-friendly Windows Phone OS was for consumers, and so Windows Mobile, although at some point renamed Windows Embedded Handheld, became a dead end. The result: while smartphones charged ahead, vendors of industrial handhelds were stuck with an increasingly obsolete OS. In the consumer smartphone market, Android quickly filled the void left by Microsoft, but industrial and vertical market customers were, and still are, much slower to adopt Android.

Second, vertical market customers often do wear gloves, and they often have to work in the rain or wherever else it gets wet, conditions in which p-cap doesn't work well. For both of these reasons, staying with resistive touch and a passive stylus made sense.

The situation, interestingly, was different with tablets. While the capacitive touch-based iPad was a runaway success that was followed two or three years later with equally successful Android tablets that also use p-cap, Android had a much harder time in industrial and vertical markets. There were a good number of attempts at industrial and enterprise Android tablets, and some saw modest success. But on tablets the pull and advantages of remaining part of the established Windows infrastructure were much stronger, and Windows tablets saw, and see, remarkable success. To the extent where not only the majority of vertical market tablet vendors continue to offer Windows tablets and introduce new ones, but where Microsoft itself is heavily investing into its own Surface tablet hardware.

Which, of course, gets us right back to Windows' weakness with pens. Microsoft's very own tablets use active pens in addition to p-cap, in essence admitting that even when using Windows 10, finger tapping alone just won't get the job done.

So we're sort of back to the same predicament pens had a couple of decades ago. Extra cost, easy to lose, proprietary. You can use any pen or pencil to write on any piece of paper, but a Wacom pen will only work with a Wacom-based tablet, an nTrig pen needs an nTrig tablet, and so on. And none of those proprietary pens could be used on a regular p-cap phone or tablet.

And this, finally, gets me to the Universal Stylus Initiative (USI). USI is a non-profit that was formed in early 2015 specifically with the goal of creating a standard that would allow any active pen to work on any p-cap smartphone, tablet or notebook.

On September 10, 2015, USI announced that their membership had grown to 31 companies, more than doubling from the initial launch in April 2015. Initial membership included such touch/pen heavyweights as Wacom, Atmel, Synaptics, eGalax-eMPIA, Hanvon Pentech, Waltop, as well as electronics industry giants Intel, Sharp, and Lenovo. By now, Asus and LG joined, as well as stylus providers (KYE Systems, Primax, Solid Year Co, Montblanc Simplo), and touch controller providers like Parade Technologies, Silicon Integrated Systems, Raydium Semiconductor and STMicroelectronics International.

This immediately brought up the question as to why any vendor of active pen systems would want to have any part of such a thing. After all, these are technologies heavily covered and protected by iron-clad patents and unassailable intellectual property. And the market for (rather expensive) replacement pens is quite profitable.

A visit to USI's website (see here) answered a few questions but not all, as the specification and most of the background technical information, of course, is only available to USI members.

I was fortunate that USI Chairman Pete Mueller made himself available to a much appreciated call. Pete, whose daytime job is that of principal engineer and senior technologist at Intel, explained that while there is indeed extensive intellectual property in active pen technology, the major players in the field have long seen the potential strategic advantage of providing a degree of interactivity. Talks between them are not unusual as making active pen technology more universally desirable will likely result in a much larger market. (Note that earlier in 2015 Wacom announced a "Universal Pen Framework")

Think about it: there are by now billions of p-cap smartphones and tablets without active pens. Given the increasing call for productivity-enhancing (i.e. creation, analysis, design, etc.) rather than activity-draining (i.e. news, video, silly cat tricks, games) smartphone and tablet technology, the availability of universally compatible pens might be a massive boon. Unlike today where the only option for tablet users are those awful fat-tipped passive pens that are hardly more precise than fingers, a universal active pen could open up entirely new functionality for little or no extra cost.

Atmel's blog quotes Jon Peddie of Jon Peddie Research as saying, "To date the market has been limited by proprietary touch controller-stylus solutions, which limits OEM choices and cost reductions. With the USI specification released, we expect that the capacitive active stylus market will grow from 100 million units in 2015 to 300 million units in 2018, opening up new markets such as smartphones and all-in-one PCs" (see quote here).

How does USI intend to make that possible? In their words, "the USI standard defines the communication method by which the stylus sends data about the stylus operation to the smart phone, tablet or notebook PC. The data includes information such as pressure levels, button presses or eraser operation. In addition, USI technology makes use of the existing touch sensor (via a technology called Mutual Capacitance) in smart phones, tablets and notebook PCs, so that the added cost for enabling USI technology on these devices is zero or minimal."
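
Since the USI specification itself is only available to members, here is a purely invented illustration of the kind of report the description above suggests: a pressure level plus button and eraser flags packed into a few bytes. Nothing about this layout is official:

```python
def decode_report(data):
    """Decode a hypothetical 3-byte stylus report (layout invented):
    bytes 0-1: little-endian pressure level (0-2047)
    byte 2, bit 0: barrel button pressed
    byte 2, bit 1: eraser end in use"""
    pressure = data[0] | (data[1] << 8)
    flags = data[2]
    return {"pressure": pressure,
            "button": bool(flags & 0x01),
            "eraser": bool(flags & 0x02)}

# Maximum pressure with the barrel button held down:
print(decode_report(bytes([0xFF, 0x07, 0x01])))
# {'pressure': 2047, 'button': True, 'eraser': False}
```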

But if we're talking active pens working with capacitive touch controllers, how could those p-cap controllers possibly work with active pens? Pete couldn't go into details on that because much is non-disclosure material, but the general idea that I got was that using a "USI-enabled" pen on a "USI-enabled" device would provide some, but not all of the full functionality of a particular active pen technology.

What does that mean? A look at original USI member Waltop's website provides some clues. It says that they provide both USI-enabled and vendor-specific styli, and that both types "satisfy the performance requirements of Windows 10, such as accuracy, linearity, latency, hovering height, etc." So presumably the USI standard seeks to cover all the mandated basics of using an active pen on a p-cap touch screen, but there are still special capabilities, extra functionality and perhaps higher performance only available through the full proprietary active pen technology. To use one of my beloved automotive analogies, USI provides the road system that allows any street-certified vehicle to get from point A to point B, but if someone has special desires and requirements, they will still want to get a Lexus or a Porsche.

However, Atmel's blog says that "Through the same sensor that one’s finger uses to command a device, the stylus communicates via different frequencies to perform the action of writing — writing with up to 2048 different levels of pressure to give the pen-on-paper experience and render thinner or thicker lines in note-taking, painting and doodling, just like an ink pen." That sounds more like some of the proprietary functionality of an active pen system is being brought over into USI-spec passive p-cap controllers.

Poking around the web a bit, it seems like USI systems will be able to differentiate between a USI mode and a proprietary mode, or even switch between the two, depending on which seems appropriate. USI pens apparently will be AAAA battery-powered allowing a slender size and a projected 12-month battery life.

As is, USI hopes to have version 1.0 of their specification done by the end of 2015, and after that we should start seeing active pens that work on any p-cap device that's also compliant with the USI spec. It should be interesting to see what will become available, how well it works, and whether the initiative will take. -- Conrad H. Blickenstorfer, September 2015

Related:

  • Website of the Universal Stylus Initiative
  • The Universal Stylus is Coming (Intel Free Press)
  • Universal stylus to bring easy digital inking to tablets (TechRadar)
  • Early example of a USI pen: Baton US-70

    Posted by conradb212 at 05:47 PM | Comments (0)

    August 27, 2015

    Replacing the Atom N2600

    This morning I received an email from German touchscreen device developer and manufacturer faytech. The company prides itself in its ability to design and engineer high-quality products in Germany, then have them manufactured cost-efficiently in Asia, while providing local service.

    This email, though, wasn't about their latest touchscreen products. It was about the processors they use in their touchscreen panels and PCs. Specifically, it was about replacing the ubiquitous Intel Atom N2600 with the newer Intel Celeron N2807 and J1900. Faytech seemed especially taken with the N2807, which the company has chosen as the new standard for their resistive touchscreen portfolio. They said, "replacing the predecessor Intel Atom N2600, the new processor has it all: speed, stability, lower power consumption, and much better performance per watt ratio." The Celeron J1900, by the way, will be the new go-to chip for faytech's capacitive touch devices.

    That caught my attention. Are the N2807 and J1900 Celerons really the N2600's successor? And if so, why? As is, Intel is making so many different processors and has so many different classifications for them that even those following the industry closely often can't tell them apart or explain why one processor should be chosen over another.

    First, why does the N2600 need replacement? True, the "Cedarview" Atom N2600 was launched well over three years ago, an eternity in Intel's rapid-fire chip development cycle. But it turned out to be an exceptionally good chip.

    A third generation descendent of the Atom N270 that powered tens of millions of netbooks, the N2600 and its slightly faster N2800 sibling were the first Atom chips to use 32nm process technology instead of the older 45nm, making for smaller, more efficient packages. Cedarview processors were dual-core systems whereas before only desktop-oriented Atom versions had two cores. Graphics performance benefitted from a different design and much faster clock speed, resulting in Intel claims of 2X graphics performance compared to the second generation Atoms. And integrated hardware-accelerated video decoding enabled smooth full HD (up to 1080p) video playback, something that was not possible with earlier Atom chips.

    Word on the N2600's qualities got around, and a lot of manufacturers that had been burned by the poky performance of most original Atom chips switched to the N2600. When RuggedPCReview benchmarked an N2600-updated Handheld Algiz 7, we found an overall 3X performance improvement over the earlier Atom Z530-based version. Another example was Motion Computing's upgrade from an Atom Z670 in its original CL900 tablet to the N2600 in its CL910 successor. Again, the performance improvement was substantial (in the 70% range).

    We praised the N2600 as "probably the best general purpose Atom chip so far." N2600 performance was so good that some manufacturers who later upgraded to some of the lower-end Intel "Bay Trail" chips were in for a harsh surprise. For example, RuggedPCReview found a "Bay Trail" E3825-based tablet no faster than its N2600-powered predecessor.

    But that was in 2013, and it's now Fall 2015. The N2600's reign seems to be coming to an end, and Intel's "Bay Trail" platform is the reason.

    Bay Trail represents a departure from Intel's product strategy of the past few years, which differentiated between low end (Atom) and high end (Core) processors. Instead, Bay Trail consists of a large family of single-, dual- and quad-core chips optimized for various types of devices. Lower end Bay Trail processors use Intel's "Atom" brand, whereas higher end versions targeting tablets, notebooks and desktops carry Intel's "Celeron" and even "Pentium" brand names.

    Further, for the first time, an Intel Atom microprocessor architecture is paired with genuine Intel graphics. The graphics cores integrated into Bay Trail systems are of the same HD 4000 architecture and variety as those used in Intel's 3rd generation "Ivy Bridge" processors, albeit with fewer execution units (four instead of several times that number) and lower clock speeds. The new graphics support most of the same APIs and features, including DirectX 11, OpenGL 3.x (and even 4.0 in some cases), and OpenCL 1.2. Better yet, some clever power-saving features from 4th generation "Haswell" Core processors seemed to be included as well.

    So it's no surprise that Bay Trail has been a resounding hit. By and large, the impression on the street is that "Bay Trail" is much faster than all those old-style Intel Atom chips, fast enough to do some actual general purpose work, even tough assignments that just may come along on any given day. That includes the kind that totally brought almost every older Atom system to its knees. And it all comes at a cost that's a lot lower than full Intel Core processors. From Intel's vantage point, Bay Trail cut down on the complaints about Atom performance while, despite all the improvements, still being quite a ways behind full Core processor performance and thus no threat for that lucrative Intel market.

    The only problem is that it further increased confusion about Intel's various product lines. Bay Trail chips, while all using an Atom CPU architecture, are sold as Atoms, Celerons and Pentiums. Aren't Celerons gutless entry-level loss-leaders and Pentiums some ancient brand from a distant past? Not anymore, apparently. Or at least it's a different kind of entry level now. Figuring out the difference between all those Bay Trail chips isn't easy. And to make matters more confusing yet, some new Celerons aren't Bay Trail chips at all; they are Intel "Haswell" Core processors (like the Celeron 2980U).

    So what about the Celeron N2807 and J1900 that the good folks at faytech chose to replace the N2600 as the standard in their touch PCs and panels? Let's take a look at the two chips.

    Both of them are based on 22nm lithography instead of the older N2600's 32nm, both use the FCBGA1170 socket, and both use low-power DDR3L RAM. But that's where the similarity stops.

    The J1900, which Intel lists as an embedded desktop processor, is a quad-core chip with 2MB of L2 cache, running at a base frequency of 2GHz and a maximum burst frequency of 2.42GHz. Its thermal design power is 10 watts, it can support up to 8GB of RAM, its base graphics frequency is 688MHz with a top speed of 854MHz.

    The N2807 is listed as an embedded mobile processor. It is a dual-core chip with 1MB of L2 cache, running at a base frequency of 1.58GHz and a maximum burst frequency of 2.16GHz. Its thermal design power is 4.3 watts, it can support up to 4GB of RAM, its base graphics frequency is 313MHz with a top speed of 750MHz.

    For a more detailed spec comparison of the N2600, N2807 and J1900, check this page.

    In terms of raw performance, the J1900 would seem to have a clear advantage over the N2807, even though Intel actually lists the recommended price of the N2807 as higher than that of the J1900 (US$107 vs. US$82). Why is the slower N2807 more expensive? Likely because as a "mobile" chip it includes additional low power modes. It also includes a number of special Intel technologies that the J1900 doesn't have (like Intel Secure Key, Intel Idle States, Intel Smart Connect, and Intel Wireless Display). Unfortunately, even Intel's spec sheets only present an incomplete picture as the sheets are inconsistent. The technically minded will find some more info in the very technical Features and specifications for the Intel® Pentium and Celeron Processor N- and J- Series document.

    What about real world performance? Here we can present some hard data.

    According to RuggedPCReview's database, N2600-based devices scored an average of 476 in the PassMark 6.1 CPU Mark test, 433 in the overall PassMark suite, and 56,070 in the CrystalMark suite. J1900-based devices scored an average of 2,068 in the PassMark 6.1 CPU Mark test, 974 in the overall PassMark suite, and 117,000 in the CrystalMark suite. The sole N2807-based device scored 782 in the PassMark 6.1 CPU Mark test, 570 in the overall PassMark suite, and 83,800 in the CrystalMark suite. Based on these numbers, one might expect an N2807-based system to offer a roughly 1.5X overall performance increase over an N2600-based device, and a J1900-based system a roughly 2.2X overall performance increase over an N2600-based device. And a J1900 system might offer roughly 1.5X overall performance over an N2807-based device.
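
Those rough multiples follow directly from the averages above. Recomputing the per-benchmark ratios (CPU Mark, overall PassMark, CrystalMark) shows where the quoted figures come from, and also how much the answer depends on which benchmark you pick:

```python
scores = {  # averages from RuggedPCReview's database:
    #         CPU Mark, PassMark overall, CrystalMark
    "N2600": (476, 433, 56_070),
    "N2807": (782, 570, 83_800),
    "J1900": (2_068, 974, 117_000),
}

def ratios(newer, older):
    """Per-benchmark speedup of `newer` over `older`, rounded to one decimal."""
    return tuple(round(n / o, 1) for n, o in zip(scores[newer], scores[older]))

print(ratios("N2807", "N2600"))  # (1.6, 1.3, 1.5)
print(ratios("J1900", "N2600"))  # (4.3, 2.2, 2.1)
print(ratios("J1900", "N2807"))  # (2.6, 1.7, 1.4)
```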

    So is replacing the venerable N2600 with either one of those Bay Trail chips a good idea? Yes. Time stands still for no one, not even for a good chip like the Atom N2600 of just three or so years ago. But we also believe that given the overall breathtaking pace of progress in the CPU arena, replacements should provide a clear and very noticeable boost in performance, and not just an incremental one. So our money would be more on the Celeron J1900, which seems to have all the goods to be a solid, remarkably speedy, yet still economical go-to chip for any number of industrial computing projects where absolutely minimal power consumption is not the highest priority.

    There is, of course, a large number of other processor options from Intel and from other sources. But the x86 world likes standards and things to rely on, and so we've historically seen a flocking to a small number of chips that offer a good overall balance. Chips like the Celeron N2807 and J1900.

    Posted by conradb212 at 04:19 PM | Comments (0)

    April 17, 2015

    Xplore Technologies acquires Motion -- How it came about

    Today I listened to the full Investor Call broadcast Xplore held on April 16 about its acquisition of Motion Computing, and a lot of things are clearer now (listen to it here).

    Motion didn't exactly choose to be acquired, and this was not one of those situations where a big company comes along and makes a financial offer just too sweet to resist. What happened was that Motion found itself in a financial bind caused by third-party issues over which Motion had little to no influence. Specifically, the supplier of the displays used in its Motion C5 and F5 semi-rugged tablets shut down its plants in South Korea without any notice to speak of. This left Motion, which uses a built-to-order model, high and dry and unable to fill C5 and F5 tablet orders, with those two products accounting for about half of Motion's sales. With half of its sales essentially on hold, Motion's financial situation quickly went from quite stable to critical, until its main lender foreclosed.

    This requires some background information. Motion, like most US electronics vendors, relies on Asian OEMs (Original Equipment Manufacturers) and ODMs (Original Design Manufacturers) to make its products. There are various nuances to such agreements, but suffice it to say that those OEMs and ODMs likewise rely on their vendors to supply parts. One such part was screens from a company called Hydis. Unfortunately, Hydis' parent company, Taiwanese E Ink, saw fit to close two of the Hydis LCD manufacturing plants in South Korea.

    Now one might assume it should be easy to source replacement screens for tablet products that, while quite successful, were not produced in the millions or even hundreds of thousands. It obviously can be done, but it's not easy. There's locating a suitable replacement, there are business arrangements to make, and there's adapting, configuring and testing, all of which takes time, and time isn't available to a company doing build-to-order. Components are changed and re-sourced all the time, but always with proper notice and proper lead time. Apparently, that wasn't the case with E Ink's shutdown of the Hydis plants.

    A bit more about Hydis. They make screens that we here at RuggedPCReview have long considered the very best. Originally launched by Korean Hyundai as Hyundai Display Technology and then part of Hyundai's Hynix Semiconductor, Hydis started working on a wide-viewing angle technology called "fringe field switching" (FFS) in 1996. That came in response to Hitachi launching the IPS display technology, which also provides superior viewing angles. Hydis was so convinced of the merits of their FFS technology that they decided to pretty much bet the farm on FFS. That was understandable, as FFS not only provided 180 degree viewing angles from all directions, but also offered terrific contrast ratio, none of the dreaded color shifts that lesser LCDs to this day display when viewed from certain angles, lower power consumption than IPS, and also no "pooling" upon touch.

    Hydis was spun off from Hyundai completely in 2001, just when Microsoft's Tablet PC effort got underway, and Hydis saw a big opportunity to be the dominant player in tablets. I recall that at Pen Computing Magazine we were blown away when we saw Hydis FFS displays in Sharp's Actius TN10W and HP/Compaq's TC1100 notebook convertibles in 2003. The Hydis displays were so much better than anything else that there simply was no comparison.

    Just when things appeared to look bright for Hydis, Hynix sold them off to Chinese LCD manufacturer BOE, and the company became BOE Hydis. Between the Tablet PC never living up to expectations and other issues, BOE Hydis didn't do well and was acquired by Taiwan's Prime View International (PVI), which eventually, in 2010, became E Ink, the folks who had pretty much cornered the market on those paper-like eBook reader displays used by the Kindle and also by Sony. PVI had actually managed to nurse Hydis back to financial health, but did so primarily by selling and licensing Hydis FFS patents. This led to Korean protests that after BOE, E Ink was also simply "asset-stripping" Hydis and thus Korean intellectual accomplishment. Add to that the fact that E Ink fell on hard times itself after the eBook market was severely impacted by the iPad and iPad-class tablets, and it's no surprise that they didn't have the resources to properly fund Hydis.

    That said, and much to Hydis' credit, the company did not rest on its FFS laurels. First came an improved FFS, and then, in 2007, AFFS+, which perfected the technology in numerous respects, including even better outdoor viewability.

    Motion, always on top of technological advances, was an early adopter of Hydis displays, giving them an edge over competition that used lesser LCDs not nearly as well suited for tablets where the ability to view the image on the display from any conceivable angle matters much more than in laptops. The superior quality of the Hydis AFFS+ displays used in Motion's C5 and F5 tablets contributed to their wide acceptance, and continued to do so even in the latest generation of the platform, launched in February 2015.

    Unfortunately, February 2015 was also the month in which E Ink suddenly shut two Hydis plants in South Korea. The stated reasons were "chronic losses" and "high manufacturing costs." That didn't sit well with the Koreans, who felt that E Ink had let the plants become obsolete, on top of simply mining Hydis for its patents. The bottom line for Motion was that they had a very promising new generation of their C5/F5 platform based on Intel's latest 5th generation "Broadwell" chips, and no screens.

    No product, no sale, and the rest is history.

    Enter Xplore. Located within ten miles of one another, the two companies knew each other well. Some of the workforce had worked at both companies, and both certainly also benefitted from the presence of Dell, which, although it has no deep business relationship with either Motion or Xplore, ensures a steady pool of highly qualified technologists in Austin.

    But if Hydis displays were so good, and especially well suited for tablets, didn't Xplore use them as well? They did. Xplore, too, was an early Hydis adopter, and that move may well have been what helped Xplore survive and eventually thrive while some of its direct competition did not. Xplore's high-end, ultra-rugged iX104 XC6 tablet has a Hydis screen. So didn't the Hydis shutdown affect Xplore as well? It did, but Xplore had inventory and did not entirely rely on build-to-order. And while the Hydis shutdown certainly had an impact on Xplore as well, their financial situation was different from Motion's, and they were able not only to absorb the blow, but also to turn it into an opportunity by taking over a very complementary Motion Computing.

    There haven't been many better examples of making lemonade from lemons. Had Xplore not been there, just ten miles away and a perfectly logical rescuer, Motion would have been liquidated and most likely ceased to exist entirely. That would have been a massive blow to Motion's customers and employees, and also to the rugged computing industry in general. As is, Xplore says that "the vast majority" of Motion employees have been extended employment offers.

    So what can we expect from all this? As is, Motion's sales apparently peaked in 2012 at over $100 million, but were still in the $80 million range for 2014. Xplore is a roughly $40 million business. Xplore warns that this doesn't mean that they're suddenly a $140 million business, and that it'll take a year or two until everything has been integrated and ironed out. Xplore Chairman Philip Sassower likened the opportunity to being given the chance to pick up a distressed mansion in a $5 million neighborhood for three quarters of a million. It'll take work and perhaps half a million in investment, but then it'll be a $5 million mansion again.

    And then there are the numbers. VDC estimates the 2015 market for rugged tablets as being worth about $585 million, and the 2015 market for rugged laptops about $1.2 billion. And that's on top of a massive number of consumer tablets, a portion of which go into enterprises that would just love to have some more durable gear. So there's plenty of upside. Xplore is already working on getting suitable replacements for the Hydis screens. And Sassower wants Xplore to be #1 in rugged tablets. -- Conrad H. Blickenstorfer, April 17, 2015

    Posted by conradb212 at 07:46 PM | Comments (0)

    Xplore acquires Motion -- what it means

    On April 16, 2015, Xplore Technologies and Motion Computing announced that Xplore was acquiring Motion. This was not a total surprise, as both companies are in the rugged tablet computer market, both are pioneers in tablets, and both are located within ten miles of each other in Austin, Texas.

    And yet, the announcement came as a surprise to me. When I had interviewed Motion CEO Peter Poulin in February of this year, Poulin had ended by saying "Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3," and he followed up by saying the company had just totally revamped all of their platforms for much greater performance and enhanced wireless communication and ruggedness. And that they had other products in the pipeline. "We're quite optimistic," Poulin concluded. And yet, just a couple of months later, Motion was acquired.

    The move was also a surprise because both Xplore and Motion have shown remarkable resilience in the face of setbacks and challenges throughout their existence. In an era where numerous rugged computing gear manufacturers either folded or were absorbed by Motorola or Honeywell, Xplore and Motion persevered and remained independent.

    As a privately held company, Motion's business fortunes were closely guarded, but the company's savvy, skills and determination were apparent throughout its history, starting with the unenviable task of taking Microsoft's flawed 2001/2002 Tablet PC initiative and running with it. Though highly publicized and initially supported by most major PC manufacturers, the Tablet PC didn't find widespread acceptance due to high costs and technology that just wasn't quite ready yet. Yet, Motion toughed it out and established for itself a nice niche in enterprise and vertical markets.

    Both Xplore and Motion were especially skillful in recognizing valuable technology advancements early on, and quickly making them available to their customers. Both companies were pioneers in such productivity-enhancing features as dual input where pen and touch worked in harmonious unison, superior outdoor-viewable displays, ergonomics suitable for actual tasks at hand, and the ability of their products to not only hold up in challenging daily use, but also perform at full speed under any operating conditions.

    On the Motion side, the company's early adoption of Intel's Mobile Clinical Assistant (MCA) platform was an impressive example of their unerring compass of what worked and made sense, and what didn't. Motion's C5 MCA -- with its square layout, integrated carry handle, and peripherals placed in the exact right spots -- became a big success, so much so that Motion added an F5 version of the platform for general enterprise and industrial use. Most impressively, while a good dozen other companies also introduced Intel MCA-based tablets, most quickly abandoned them again, lacking Motion's razor-sharp focus on their markets and tablet products.

    Fellow Austin resident Xplore impressed through sheer determination. Time and time again Xplore found new investment as the company's leadership tirelessly presented its case. That wasn't always easy with what was, for a long time, essentially a one-platform product lineup.

    I well recall first seeing them at a Comdex trade show in Las Vegas in the late 1990s where they had a large, terrific display, a convincing message, and jaw-dropping prototypes that, however, were not quite final yet. That was quite common back in those days, and most of the attempts led nowhere. But Xplore was back the next year, and the year after that.

    When we published the Pen Computing print magazine and did annual Editor's Choice and best product awards, Xplore scored with its impressive GeneSys Maximus. I remember calling Xplore with the good news, and they were disappointed that it was the GeneSys that got the recognition, and not their then semi-secret brand-new iX104. Little did we know that that machine was to become the core engine of Xplore's success and future.

    So why Xplore and Motion got together now, after all those years, I don't know. Business imperatives, I assume, and I am sure it makes perfect sense. But what does it mean looking forward, especially in the light of many such acquisitions that did not work out for the best? In the past we've seen large companies almost mindlessly snapping up much smaller ones. That's not the case here. We've seen fierce competition where one competitor eventually came out on top and annihilated the other. That's not the case here either. So let's see what Xplore and Motion bring to the table.

    Historically, Xplore has tended to the ultra-rugged tablet market, whereas Motion concentrated on a variety of vertical markets that required durable, specially designed and configured tablets. Motion does not have anything that competes with the various versions of Xplore's ultra-rugged iX104 tablets (see here). Xplore doesn't have anything like Motion's C5 and F5 semi-rugged tablets with their integrated handles. Xplore also doesn't have anything like Motion's R12 tablet with its big 12.5-inch screen (see here). So there's no overlap there. And Motion doesn't have anything Android-based, whereas Xplore has its modern, innovative RangerX tablet.

    There is a degree of overlap in just one area, and that's in the promising and potentially quite lucrative area of compact lightweight Windows tablets. Those are the tablets for users who do need Windows, but want it in a trendy, sleek and attractive iPad-like design that's tough enough to hold up on the job. For that Xplore has their Bobcat (see here) and Motion has its CL920 (see here). These two, though, are also different enough to be able to co-exist in the short term, with the CL920 as the unified company's enterprise market tablet entry and the Bobcat for tougher assignments that require more armor and a higher level of sealing.

    Most importantly, there is very little existing customer overlap with these two companies. Xplore has traditionally concentrated on oil & gas, military, government, heavy industry, and similar, whereas Motion is primarily active in healthcare, retail, construction, field service, and so on. If Xplore plays its cards right, it can emerge as a much larger company with a much broader reach, and also perhaps as an example of where 1 + 1, for once, adds up to more than 2. I've said it ever since the consumer tablet boom began, and I'll say it again: with the tablet form factor fully accepted virtually everywhere, there's tremendous opportunity for rugged equipment vendors to step in and successfully provide this desired and contemporary form factor in products that do not break on the job and in the field.

    Overall, this development may also be good news for other independents in the rugged tablet market, companies like Getac, GammaTech, MobileDemand, the Handheld Group, and others: resistance is not futile. Keeping it in the family and preserving the unique, special expertise of the rugged computing industry may well be the best way to success and prosperity.

    -- Conrad H. Blickenstorfer, 4/16/2015

    Posted by conradb212 at 12:05 AM | Comments (0)

    February 10, 2015

    Conversation with Peter Poulin, CEO Motion Computing

    On February 5th I had a chance to speak with Peter Poulin, who was appointed Motion Computing's CEO on December 11, 2014. An industry veteran with more than 25 years of sales, marketing and general management experience in the public and private sectors, Poulin, according to the company's press release, aims to capitalize on the company's deep mobility expertise and aggressive investments in the design and development of ruggedized tablet platforms and integrated mobility solutions to expand its reach within target vertical markets.

    Over the years, I've been covering Motion's vertical market tablet lineup in some detail, going back to a meeting in San Francisco in 2001 where Motion CEO Scott Eckert and founder David Altounian showed me the prototype of their first tablet. Motion was a startup then, formed to be part of Microsoft's Tablet PC initiative.

    While the overall Tablet PC project was not as successful as Microsoft had hoped, and it would be almost another decade before the iPad finally put tablets on the map, Motion established itself as a provider of enterprise and ruggedized tablets in various niche markets. Motion succeeded where many others failed with their early tablet efforts due to focusing on tablets and tablets alone (Microsoft itself had flip-flopped during the Tablet PC gestation period, switching emphasis from pure tablets to convertible notebooks), and also by displaying an unerring ability to recognize new trends and technologies at an early stage and making them available to their customers.

    One look at Poulin's resume shows that he's uniquely qualified for the job as Motion's CEO. An electrical engineer with a degree from Cornell, Poulin worked for Compaq for 13 years in sales, marketing and management. He then broadened his expertise and horizons with sales and business development stints at such diverse technology companies as Segway, NetBotz, APC, internet solutions providers Hoovers and Virtual Bridges. Poulin joined Motion Computing in July 2012 as VP of marketing and then ascended to CEO.

    Here are some of the highlights of our discussion:

    RuggedPCReview: Microsoft's Tablet PC initiative wasn't a great success and most early tablet providers exited the market within just a couple of years. Except Motion. What made Motion successful with early Microsoft-based tablets where others failed?

    Poulin: The answer is, it's not just about the tablet. It’s really about understanding the customers’ workflow, and integrating the technologies that enable that workflow, of which the tablet is one component. Motion decided early on to focus on a limited number of verticals. In the process we gained a great amount of expertise on how customers use technology. I believe what differentiates Motion is that we have very purpose-built devices that truly make the job easier. An example is the unique keyboard pairing we use with our R12 tablet. It's super-easy and there's none of the frustration users often have with Bluetooth pairing sequences. We know how field service workers work, we know how to build docks that work for them, peripherals that work for them, features that they need. Yes, we seek to grow as a company, but we are careful not to lose that depth and connection to our customers and spread ourselves too thin.

    RuggedPCReview: Ever since the introduction of the iPad, tablets have become a huge success. But the success is primarily in consumer markets and to some extent in the enterprise. Why is that?

    Poulin: We see the tablet as a very successful enterprise tool, and we have the mass consumer adoption of the tablet to thank. However, the consumer and the enterprise have very different needs. For many enterprises and vertical market businesses it's a matter of how to reduce deployment risks. They want to know how they can protect their investment. They need to leverage existing investment. They need to reduce downtime. They need to focus on user work flows. And they know that if their users don't embrace a technology it just won't work. One of our customers, Thames Water, engaged in extensive user testing (200 users) before making a decision. Our Motion F5 was chosen as the clear favorite in user testing, with 86% preferring the device over two competing alternatives. Our tablets are replacing a fleet of legacy handhelds to run Thames Water's SAP and ClickSoftware asset and field management systems. User testing and user acceptance were key elements in Thames Water's decision to choose Motion.

    RuggedPCReview: Over the years, Motion has generally been a pioneer in quickly making new and better technologies available to users. Examples are superior displays, input technologies, the latest processors, new form factors, etc. Is this part of Motion's corporate culture?

    Poulin: Motion has always had a lot of excellent tech people. We have the discipline of big corporation experience, complemented by the agility of startup experience, and that helps us move fast, be first, be innovative. This has undoubtedly shaped Motion's culture. But I believe we also have a great balance between technical and customer experience. While the tech innovations are most visible, we're also constantly working on details, peripherals, modules, and how to easily make them part of our tablets. That takes a lot of risk out of integration, and our customers appreciate that.

    RuggedPCReview: We currently have this interesting situation where Apple and Android-based tablets almost completely dominate the consumer markets, whereas Microsoft remains strong in tablets designed for enterprise and vertical markets. For now, all of Motion's tablets use Windows. How do you see the Windows versus Android situation?

    Poulin: We watch that situation very, very carefully. I think one difference between consumer and vertical markets is that on the vertical side it all depends on application software, the kind that does all the heavy-duty lifting, and almost all of that runs on Microsoft. Are verticals adopting Android? Yes, to some extent. Some of our customers are trying Android with very narrow apps for certain very specific tasks. The challenge for Android comes with heavier duty apps, development and maintenance cost, and the fact that, for now at least, Android changes so very quickly and older versions are no longer supported. For IT organizations, that cadence of change is painful.

    RuggedPCReview: Microsoft is putting a heavier emphasis on cloud services. Where do you stand on that?

    Poulin: Given the ubiquity and ever-increasing performance and reliability of broadband connections, Motion is paying a lot of attention to cloud-based applications and services. Along with that, security is becoming an ever-greater concern, both in the cloud and also with broadly distributed devices. Motion has long considered security as an integral part of our products and services with TPM, Computrace, multi-factor authentication, etc. In our newly-released F5m and C5m tablets, we're stepping security up by another level with self-encrypting drives.

    RuggedPCReview: While Microsoft certainly still has a huge edge in enterprise and vertical market deployments, there are also challenges as Microsoft attempts to integrate mobile markets into its OS strategy.

    Poulin: Yes, there's certainly a challenge with Windows 8 and 8.1, but overall they're getting bashed a bit too much. Microsoft hasn't done bad, and things are only getting better now. Microsoft is just so eager to get it right that perhaps they moved to catering to consumers a bit too fast, and that can be very disruptive to the enterprise. Then there are the migration issues. Windows 7, Windows 8, Windows 8.1, and soon Windows 10, and they need to support everything. It's not easy to make an OS attractive to consumers as well as corporate customers.

    RuggedPCReview: On the hardware side, Intel has been charging ahead at a very rapid pace with successive generations of Core processors. How difficult and important is it to keep up with Intel?

    Poulin: It's not that complicated on the high end, because the performance levels are there, and have been there for a while. Motion customers do not always want such a rapid pace, so sometimes they skip a generation, and sometimes it's tempting to skip two. It was more complicated at the low end, where it took Intel a while to get up to speed with the Atom platform. That was a bit tough for a while, but they're now sorting that out, and Motion is very confident in the range and predictability of Intel's product roadmap.

    RuggedPCReview: We can't help but notice that Austin, Texas, seems to be a hotbed for tech development and rugged systems. Dell is there, of course, and Motion, and also Xplore. What makes Austin special?

    Poulin: There's lots of talent in the Austin area. There are lots of big companies and also a vibrant startup community. Somehow it all came together.

    RuggedPCReview: Where, overall, does Motion stand now, and what are the plans for 2015 and beyond?

    Poulin: Motion is in a good position. According to VDC, Motion is the #2 player in rugged tablets, more than twice as large as #3. And we've just totally revamped all of our platforms, the CL-Series, the C5 and F5 Series, and the R12. All have much greater performance, and we also enhanced wireless communication and ruggedness. And we have other products in the pipeline. So we're quite optimistic.

    -- Conrad H. Blickenstorfer, Editor-in-Chief, RuggedPCReview

    Posted by conradb212 at 04:34 PM | Comments (0)

    December 21, 2014

    Storage wars

    Anyone who's ever watched the "Storage Wars" reality TV series on the A&E Network knows that with storage, you never know what you're going to get. That's true for stuff people stow away in storage units, and it's also increasingly true with the kind of storage in our electronic devices.

    There was a time when RAM was RAM and disk was disk, and for the most part the only rule was that more was better. But that was in the era when you could count the total number of available Intel processors on the fingers of a hand or two instead of the roughly one thousand they offer now.

    These days things are much more complicated. And just as no one often seems to quite know for sure why a certain processor was chosen for a particular product, there aren't easy answers as to why a product uses this type of storage versus that. There often appears to be a disconnect between the engineers who make those often arcane decisions and the marketing side that must explain what it's for and why it's better.

    Here are a couple of examples we've recently come across:

    DDR3L vs. LPDDR3 — In mobile computing devices designed to draw as little power as possible you generally not only find "low power" processors (where "low power" refers to low electricity consumption, not low performance), but also DDR3L random access memory. In the case of DDR3L, the L stands for a lower voltage version of the DDR3 memory standard, specifically 1.35 Volts instead of the standard 1.5 Volts. That makes sense.

    However, you now increasingly also see devices with LPDDR3 RAM, especially in handhelds and tablets. LPDDR3, it seems, was created specifically for such devices, and their need to be able to go into standby modes where the memory uses as little power as possible. LPDDR3 runs on 1.2 Volts, is part of the main board, and generally uses only about a tenth as much power while in standby as does regular DDR3 RAM.

    Initially I thought LPDDR3 was a lower cost type of memory, as we first saw it in lower cost units. But apparently its cost is actually higher than that of standard or even low power DDR3. And it's used in high-end devices such as the Apple MacBook Air.
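    To put those voltage figures in perspective: dynamic power in CMOS circuits scales roughly with the square of the supply voltage, so even modest voltage reductions compound into meaningful savings. The little sketch below is a simplification (real savings also depend on frequency, signaling, and the dedicated low-power standby states that give LPDDR3 its roughly 10:1 standby advantage), but it shows the rough arithmetic:

```python
# Rough CMOS dynamic-power scaling: P is proportional to C * V^2 * f.
# At equal capacitance and clock, relative power goes as (V_new / V_old)^2.
# This is a back-of-the-envelope model, not a memory power specification.

def relative_dynamic_power(v_new, v_old):
    """Dynamic power of a circuit at v_new relative to the same circuit at v_old."""
    return (v_new / v_old) ** 2

ddr3l = relative_dynamic_power(1.35, 1.5)   # DDR3L vs. standard DDR3
lpddr3 = relative_dynamic_power(1.2, 1.5)   # LPDDR3 vs. standard DDR3

print(f"DDR3L:  roughly {ddr3l:.0%} of standard DDR3 dynamic power")
print(f"LPDDR3: roughly {lpddr3:.0%} of standard DDR3 dynamic power")
```

    By this crude measure, DDR3L at 1.35 Volts draws about 81% of standard DDR3's dynamic power, and LPDDR3 at 1.2 Volts about 64%; the much larger standby savings come from LPDDR3's separate low-power states, not from supply voltage alone.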

    SSD vs. eMMC — We've long been seeing substantial differences in benchmark performance between different types of solid state storage. There's generally not much of a difference between the performance of different rotating media, a large difference between solid state storage and rotating media (solid state is much faster), and a very large difference between different types of solid state storage.

    Recently we observed an almost 10:1 difference between the SSD and eMMC mass storage performance in two tablets of roughly the same size, one low end and one higher end. SSD stands for Solid State Disk, eMMC for embedded Multi Media Card. The generic difference between the two, apart from a great variance in the speed of the flash memory itself, is the sophistication and complexity of the controller and interface. eMMC uses a basic controller and a relatively slow interface, whereas SSDs use complex controllers and one of the various SATA interfaces.
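    Differences of that magnitude show up even in a crude timing test. Below is a minimal, hypothetical sketch of a sequential-write benchmark in Python; it is not a rigorous storage benchmark (it ignores random I/O, queue depth and cache effects), but it is enough to expose a large eMMC-versus-SSD gap. The file name is an arbitrary placeholder:

```python
import os
import time

def sequential_write_mbps(path, total_mb=64, chunk_mb=4):
    """Crude sequential-write test: returns throughput in MB/s.

    Writes total_mb megabytes of random data in chunk_mb chunks and
    forces the data to the device with fsync so the page cache doesn't
    flatter the result. Numbers still vary with controller and flash type.
    """
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # flush to the actual device, not just RAM
    elapsed = time.perf_counter() - start
    os.remove(path)  # clean up the scratch file
    return total_mb / elapsed

if __name__ == "__main__":
    print(f"Sequential write: {sequential_write_mbps('bench.tmp'):.1f} MB/s")
```

    Run the same script on an eMMC-based tablet and an SSD-based one and the ratio of the two numbers makes the controller-and-interface difference tangible.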

    There's always an inherent answer to any tech question; it's just that those answers aren't easy to come by. What I'd like to see more of is how something works, what its pros and cons are, and why it's used in certain products.

    Posted by conradb212 at 04:28 PM | Comments (0)

    November 04, 2014

    Intrinsically safe ecom Tab-Ex: another rugged tablet based on Samsung hardware

    This morning I saw in the news that at the IFA Berlin show, ecom instruments launched what the press release called the "world's first Zone 1/Div. 1 tablet computer". New tablets are launched all the time, but I quickly realized that this was relevant for two reasons:

    First, the Zone 1/Div. 1 designation means it's a tablet for use in hazardous locations. Zone 1, in the IEC/ATEX classification system that handles intrinsic safety issues, means the device can safely be used in areas where flammable gases are likely present. In essence, that requires that there's no chance the device can generate sparks or otherwise cause an explosion. I'd need to look up what exactly Div. 1 refers to; there are two different entities handling these classifications, the North American National Electrical Code (NEC) and the European ATEX directive.

    Intrinsically safe devices, i.e. devices that are incapable of igniting gases, dust or vapors, are very important in certain deployments, and so this new ecom tablet will certainly attract attention.

    The second reason why this new ecom Tab-Ex tablet is relevant is that it's another example of a stock consumer Samsung tablet inside a specially developed case. In May 2014 we took a detailed look at the N4 device from Two Technologies, which is a Samsung Galaxy Note II with a value-added rugged case that also included a second battery and a physical keyboard (see our review here). But whereas the Galaxy Note II is a 5.5-inch "phablet," the ecom tablet is based on the Samsung Galaxy Tab Active with a much larger 8-inch screen. And the Tab Active offers pretty impressive ruggedness specs even without any third party enhancements: It's IP67-sealed, it can handle four-foot drops, and its 400-nit backlight is bright enough for outdoor viewing. The Tab Active is a US$700 tablet and you can see its full specs here.

    ecom isn't hiding the fact that their Tab-Ex is based on a Samsung tablet. Even the press release openly states that this type of solution "provides compatibility and a wide range of preloaded applications for a safer work environment, including unparalleled security functions like device encryption, MDM, VPN and secure connectivity (Samsung KNOX)." And, perhaps even more importantly, that "being able to offer the same internal tablet, Samsung Galaxy Tab Active, ecom provides a key benefit of consistency in product use - whether Rugged, Zone 2 / Div. 2, Div. 1 or Zone 1. And, the software applications you develop for the Samsung Galaxy Tab Active will work unchanged on the Tab-Ex 01."

    All of this makes the ecom Tab-Ex another interesting case in the current discussion on where the rugged computing industry should move to best take advantage of the worldwide popularity, and impressively high tech, of consumer smartphones and tablets. As is, there are vehemently different opinions in the industry. Some feel that it makes perfect sense to pack readily available consumer technology into a value-added case, whereas others feel that the guts of a rugged device have to be just as rugged, and that the rugged market is inherently incompatible with the 6-month product/technology cycle of the consumer market.

    Below you can see the ecom Tab-Ex on the left, and the donor Samsung Tab Active on the right.

    And here are the ecom Tab-Ex product page and the Tab-Ex brochure.

    Posted by conradb212 at 03:20 PM | Comments (0)
