Archive for February, 2009

Google Gaffe: Gmail Outage Shows Pitfalls of Online Services

Thursday, February 26th, 2009 by David Goldes

Google’s Gmail system was down for 2.5 hours earlier this week, the sixth such outage in the past eight months.  It isn’t unusual for an e-mail system to crash, but most such occurrences are limited to one organization.  When Gmail, a service Google touts to businesses as more reliable and easier to use than Microsoft Exchange and Lotus Notes/Domino, goes down, it makes headlines – as well it should.

Applications that exist “in the cloud,” such as Gmail and Salesforce.com, come with risks that are not readily apparent to many people, especially relatively unsophisticated users and managers in smaller organizations.  Gmail was first introduced in 2004; a business version of the offering was released in 2007.  Its pricing model, $50 per user per year, is very attractive to many organizations that lack the ability to manage their own IT infrastructure.  Yet outsourcing your e-mail, which is essentially what using Gmail amounts to, is far different from outsourcing other aspects of an operation, such as the company cafeteria.  Unless cooking is your core competency, there is no reason to keep that operation in house.  But e-mail is the lifeblood of almost every organization today; rather than pick up the phone, people send e-mail – and they expect that it is received promptly on the other end.

Just imagine if all of the phone lines to your office failed – not today but ten years ago, when the telephone was the most important means of communication (along with fax, I should add).  That’s what Gmail’s users were facing on Monday.  The silence was deafening.

In addition, after five years and 30 million users, many of them corporate accounts, Google still considers this a beta product.  Apparently, based on the adoption rate, companies have had no compunction about using beta-ware for mission critical e-mail services.

Would a non-cloud based system perform better?  Perhaps not, but when it fails, not everyone would go down at once.

David M. Goldes is the president of Basex.

Defining Productivity for the Knowledge Age

Thursday, February 26th, 2009 by Jonathan Spira

Productivity is a term you may hear on a daily basis, but have you ever stopped to consider its meaning, especially within the context of knowledge work and knowledge workers?  It probably isn’t what you think it is.

Promises of productivity increases frequently come from technology vendors in the course of promoting their offerings.  Few, if any, appear able to explain exactly what they are promising, leaving one to wonder whether one might therefore type faster, have more meetings, hold more efficient meetings, or write more memos and e-mail messages.  In a more serious vein, however, what exactly do we mean when we use the “p” word?

In an industrial setting, defining productivity is simple; it’s how many widgets go flying out the factory door in a given period.  The American Heritage Dictionary of the English Language, Fourth Edition, defines productivity as “the rate at which goods or services are produced, especially output per unit of labor.”  The Bureau of Labor Statistics uses the term “output per man-hour” to indicate productivity.  When applied to knowledge work however, it seems that all bets are off.  So how exactly can one measure knowledge worker productivity in a quantitative fashion?

It took 150 years from the dawn of the industrial age for a management science to begin to develop.  Unfortunately, there is little applicability of the industrial age’s management science to a knowledge economy setting.  Indeed, today we are in the very early stages of developing a management science for the knowledge economy, and it will probably be decades before we fully understand even what questions have to be asked.  The wide range of tasks that knowledge workers undertake, combined with the fact that there are different levels of knowledge workers, ranging from those with a single skill to highly skilled workers who exercise independent thought and action most of the time, makes both defining productivity and developing a management science somewhat tricky, to say the least.

We’ll continue to examine this topic in the coming weeks and months.  In the meantime, if you have any thoughts, suggestions, or comments, please share them here.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

Why Internet Commerce Won’t Work (until we fix it)

Thursday, February 19th, 2009 by Jonathan Spira

I’m a big believer in the potential of the Web. I prefer to do my shopping, research, sourcing, etc. on the Web and conduct a dialogue via e-mail.

However, there are various impediments which have been placed in my (and other users’) electronic path. Until it is recognized that having a Web site that says “place your order here” does not constitute electronic on-line commerce, it is unlikely that the Web will become transaction oriented.

Too often, the rules that would normally apply to starting a new business unit are cast by the wayside. These rules include the basic tenets of doing business, i.e. why customers would choose to shop at your ‘store’ rather than take their business elsewhere. For years, we have railed against those organizations which created Web sites in total isolation from the underlying core business. With electronic commerce, these issues take on increased importance.

I have prepared a simplified list of these common-sense rules which are generally and flagrantly violated at most on-line sites that I have seen.

1.) Make shopping convenient. Customers don’t go into real stores for inconvenience. If finding something is harder on line than in a bricks-and-mortar store, or takes longer (including driving time to and from), customers will take their business elsewhere.

2.) Make shopping friendly. Customers don’t like stores with impertinent clerks. Web sites with impertinent pages are equally bad. Test things before you go on line. Forewarned is forearmed.

3.) Be responsive to your customers. On-line customers expect prompt e-mail replies to queries. Several hours is good. Next day (early) is tolerable. No response is deadly.  Even if you don’t sell products directly, you have an obligation to ensure that you answer your customers’ queries.

4.) Provide a reason to go on line. Access to a huge selection that couldn’t be inventoried in every mall is a primary raison d’être for electronic commerce. Don’t offer outtakes. Customers will go elsewhere.

5.) Don’t put barriers in the way. Making customers answer a few questions whose answers are readily at hand is a good way to get to know your customers. Imposing non-user-selectable login names and passwords, or asking for information that isn’t easily available, is akin to having your store hours run from midnight to 8 a.m. only.

6.) Imagine the shopping experience from the shopper’s perspective. Make certain you test the site using typical customers. See how they react and whether they enjoy the experience. Expect the on-line customer to test the limits of the site.

7.) Most importantly, take advantage of the medium. Give your shoppers something they cannot get off line, such as forums, articles and reviews on products, an opportunity to see works-in-progress. Deliver information to your customers even when they’re not at your site (for example, using the “push” model of information delivery).

8.) Don’t leave anything to chance. And don’t outsource the entire project. If on-line commerce will be crucial to your company’s strategy, integrate it into the strategy and manage it like any other business unit.

My own experiences speak to this issue.

I am still waiting (from mid-December 1996) for 3M to respond to my e-mail and follow up e-mail several days later enquiring after the availability of certain Post-It Notes.  Hewlett-Packard (also from mid-December) never answered my enquiry about obtaining service for a certain HP-brand server. On a more positive note, The New York Times has an e-mail agent that sends responses to e-mails about their Web site. Several days later, a response from a human usually follows.

Warn the customer of limitations.
I purchased tickets on Delta’s new on-line Reservations Desk. Unfortunately, the site never warned me that my group discount (5%) could not be used on the Web. I spoke to several sympathetic people at their various phone centers.  One advised me that there was a special telephone number for questions relating to reservations from the Web site, which she herself didn’t know, but which was definitely listed on the Web site, “somewhere.” It isn’t. The Reservations Desk itself is a great concept. However, there are no warnings as to its limitations (no group discounts, for example).

I tried purchasing some clothes on line (I don’t remember the site name). I wanted to order two of one type of shorts, and a shirt. The very rigidly-designed order form would only let me order one of each. I declined.

No response in any medium:
Customer service should be pervasive; it should not occur in just one area of the company. From an experience trying to purchase a disk drive, I learned that bad Web service may be symptomatic of the entire organization. I went to the Web site of a large computer distributor, where I had my very own customer number and everything (and with whom I had been dealing for at least the past ten years). They had recently launched their Web site and it looked promising.

The first thing was to gain admission to the site. I had not yet used the site, so I entered my customer number as requested and waited. What I received was a message saying that this number was not valid for this purpose, and that I could call a toll-free number to speak to a customer service representative. The message further advised that someone would call me the next day (I was doing this on a Sunday) in any case.

I called the telephone number. I then waited on hold for approximately twenty-five minutes (it’s not that I’m patient, but I have a good speakerphone). At minute twenty-five, the system hung up on me!

The promised telephone call never came, and I actually forgot that I still needed to purchase the disk drive until Wednesday. So I tried the Web site again, entering the requested information including my very valid customer number (which I had just used in an old-fashioned, analog transaction a few weeks earlier). The results were the same, right down to the toll-free telephone call: this time the system hung up on me after about twelve minutes, and the phone call that the screen message promised never did arrive.

On Thursday, I resolved to try a different tack. I called the company’s toll-free order line, entered my customer number as requested, and was transferred to a representative’s voice mail. I left a message, indicating which hard disk drive I was interested in, along with my phone number, and waited. Although the representative’s voice mail promised me (in her own voice) a prompt return call, that call never came.

To those who might say that the company was just having a bad day, I say balderdash. If the site was not ready for commerce, it should not have been sitting there at the distributor’s URL. Most stores don’t open their doors to customers when the shelves aren’t stocked or the clerks aren’t trained. That goes for the Net as well.

There is no excuse for not testing your site thoroughly. The types of problems I encountered simply should not have happened. All they will do is turn consumers off from the technology. And since the company that launched this site was a leading supplier of technology products, the crime is even more heinous.

Epilogue: my brother purchased the hard disk drive at one of the discount chains. They didn’t ask for a special account number; they didn’t promise a call back the next day. They did have the merchandise in stock and the ability to hand it over, in a shopping bag, using an old-fashioned plastic credit card to validate payment.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.  This article originally appeared in the Basex Online Journal of Industry and Commerce (BOJIC).

Information Overload in Government: $31 Billion Spent Managing Information

Thursday, February 19th, 2009 by David Goldes

If you’ve ever wondered what the typical government worker does in the course of his workday, there’s a good chance he spends a lot of time filing, deleting, or sorting paper and/or digital information.  According to research released today by Xerox and Basex, based on a survey conducted by Xerox and Harris Interactive, 58% of surveyed U.S. government and education workers spend nearly half of the typical workday doing just that.  Our research found that the effort to manage information costs local, state, and federal governments a minimum of $31 billion per year.

Today, with cutbacks in services looming if not already in place, tackling the problem of information overload is a good place to start eliminating some of these costs.  Taking such steps will speed up work processes, reduce stress levels, and save time and money.

The survey itself was quite revealing.  57% of those surveyed said that not finding the right information was more frustrating than being stuck in a traffic jam.  38% said that they had to redo reports or other work as a result.  24% said they later discovered they had used the wrong information in preparing their work, and 37% agreed that their organizations are drowning in paper (yes, paper: 50% of the processes of those surveyed are still paper based).

If you are curious about your organization’s exposure to Information Overload, visit our Information Overload Calculator.  The calculator allows you to estimate the impact of the problem on your own organization.

So far, well over 5000 people, in industries ranging from advertising to zoology, have determined their exposure.  If you haven’t yet put a dollar value to your exposure, please fasten your seatbelt and try it yourself.  You’ll be glad you did.
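For readers curious how such an estimate is even constructed, here is a rough back-of-envelope sketch. This is not the actual Basex/Xerox methodology or the Information Overload Calculator itself; the worker count, hourly cost, and time-share figures below are made-up assumptions purely for illustration.

```python
# Hypothetical back-of-envelope estimate of information-management cost.
# All inputs are illustrative assumptions, not figures from the Basex study.

def info_management_cost(workers, hourly_cost, hours_per_day, share_of_day, workdays=230):
    """Annual cost of time spent filing, sorting, and searching for information."""
    hours_lost_per_worker = hours_per_day * share_of_day * workdays
    return workers * hours_lost_per_worker * hourly_cost

# Example: 100,000 workers at a fully loaded $30/hour, each spending
# 25% of an 8-hour day managing information, over 230 workdays.
cost = info_management_cost(100_000, 30.0, 8, 0.25)
print(f"${cost:,.0f} per year")  # → $1,380,000,000 per year
```

Even with deliberately conservative inputs, the numbers scale into the billions quickly once government-sized workforces are plugged in.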

David M. Goldes is the president of Basex.

Enterprise Social Networking: Some thoughts from the Online Community Unconference 2009

Thursday, February 19th, 2009 by Cody Burke

Last week I moderated the Social Networking in the Enterprise session at the Online Community Unconference East 2009 in New York.

We discussed trends in social networking both internal and external to the enterprise.  In attendance were over 15 knowledge workers from a variety of organizations, including Crowd Fusion, IBM, Leader Networks, Leverage Software, McKinsey, MediaVision, Ramius, SAP, Social Intent, Symphonic Consulting, and Time, among others.

Here is what we discussed.

Despite the proliferation of social networking, many organizations remain clueless in this area.  Ultimately, most companies want to use social networking to improve collaboration and knowledge sharing, but they are not sure how to proceed.  In addition, many organizations feel pressured to use public social networks for marketing purposes, but they typically do not have a clearly defined set of goals in mind.

It is also important to recognize that building a social networking presence requires a lot of work behind the scenes.  Just because everyone else has a corporate Facebook page does not mean that it is right for your company.  Clearly, more thought needs to go into the benefits of developing a social networking presence in the context of an organization’s identity and its own requirements.

One thing was clear (at least to me): companies that develop social networking tools for the enterprise will need to educate decision makers about the benefits of those tools in order to gain traction in the marketplace.

Another interesting topic was that of expertise location, something Basex has reported on extensively.  Many knowledge workers experience difficulty in finding subject matter experts, e.g. a Russian speaker or someone who understands how to deploy a specific software solution, and view social networking tools as a possible solution.  Another interesting trend is that some companies are considering deploying fairly sophisticated social networking tools although they have not yet deployed fairly basic community and collaboration tools (such as instant messaging).  That type of leap may not work very well for their knowledge workers.  Social networking tools add a level of complexity that some may not be quite ready for.

In terms of knowledge sharing, we heard that many knowledge workers are still information hoarders and have not learnt that there is tremendous value in sharing information with colleagues.  If an organization can’t get past this obstacle, it will not be able to compete successfully in the knowledge economy, where knowledge sharing is, of course, de rigueur.

The foregoing was just a brief overview.  As with most good discussions, more questions were raised than there was time to answer, but the quality of both the people and the ideas present was refreshing, and we at Basex look forward to continuing this conversation.

Cody Burke is a senior analyst at Basex.

FiOS Follies

Wednesday, February 18th, 2009 by Jonathan Spira

First announced in July 2004, Verizon FiOS couldn’t come to my neighborhood in New York City soon enough. Using fiber-optic connections instead of copper wire to bring telephone service, Internet, and television into the home, FiOS (which stands for Fiber Optic Service) was certainly worth the wait. So was the pain of the installation process and problem solving that followed.

After five hours plus, and a call for a more experienced installer, my FiOS service was up and running – more or less.

The installation consists of bringing the fiber-optic connection into the home and terminating it in an optical network terminal (ONT), which serves as an interface to inside wiring for telephone, television, and Internet access.

The TV service itself is superb, with better picture quality than our cable company (Time-Warner) had ever provided. The multi-room DVR (digital video recorder) system allows streaming of recorded programs (HD and standard) to other TVs in the home. Widgets provide local traffic and weather and local and national news on the top of the screen while programs continue in a slightly smaller size below.

The FiOS Interactive Media Guide has an easy-to-use tabbed interface and allows searching for words that appear anywhere in the description. One can remotely program the DVR via the Web (or using a Verizon mobile phone). The service features over 100 HD channels, 500 all-digital channels, and 14,000 video-on-demand titles (8,500 are free).

The Internet service is lightning fast. It consistently measures close to 20 Mbps, about seven times faster than my DSL service ever was. It’s so fast that my partner and I can each watch a different streaming TV show on our respective computers without any problem (with DSL, one show was frequently more than the service could handle).

It was the plain, old telephone service (known in the industry as “POTS”) that turned out to be the big problem.  The day after installation, I noticed that many of my calls were not going through; instead, after dialing, I would hear an ACB recording (“We’re sorry, all circuits are busy…”).  After weeks of investigation, this turned out to be a software error; my phone line was coded as an account disconnected for non-payment.  I also found that a few times a day I couldn’t place a call at all; pressing the number pad would simply not break the dial tone.  A reorder tone (which sounds like a fast busy signal) would follow, then a message stating “if you’d like to make a call, please hang up and try again.”  I told the repair bureau it was a bad line card, but they didn’t seem to believe me.  The problem took over two months and dozens of phone calls to resolve, along with odd fixes tried at the phone company’s suggestion (twice they had me unplug all of my phones; they also replaced the ONT and sent a technician to check the inside wiring).  In the end, the problem was determined to be exactly that: a bad line card in the Nortel softswitch.

A few small glitches remain to date. The remote set-top box loses the connection to the main DVR several times a day and it also has trouble playing recorded programs longer than 30 minutes. In such cases, it loses track of where it is. (Verizon promises a fix for the first problem shortly and advises that the second problem is being worked on.) In addition, the problem in placing a call mysteriously returned for two days recently and then disappeared again.

By this time, you are probably wondering if getting FiOS is worth it – and my answer is a resounding “yes.”

The clear, sharp television picture and the lightning-fast Internet connectivity are simply head-and-shoulders above any other service I have seen – and I saved the best for last.  Even with faster speed and a sharper picture, I’m saving money.  A bundle including TV, Internet, and telephone service is $99.99 per month plus taxes and fees (previously I was paying 60% more for inferior service).

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

Rekindling the Flame – Amazon Introduces Kindle 2

Tuesday, February 10th, 2009 by Jonathan Spira

When the original Amazon Kindle was introduced, I tried very hard to like it.  While there were many things that it did well (see my original review), the reader experience was ultimately unsatisfying.  At the time of its introduction, however, the Kindle was certainly the latest and probably greatest eBook reader, a concept that goes back to Sony’s introduction of the Bookman in 1991 and the Sony Data Discman in 1990.

The original Bookman weighed two pounds and could play full-length audio CDs.  It was, essentially, an 80286-based, MS-DOS-compatible computer with a 4.5″ monochrome display.  Even before the Bookman, Sony had introduced the Data Discman Electronic Book Player.  The Discman weighed only 1.5 pounds, but books had to be created using the Sony Electronic Book Authoring System.  Its three-hour battery life, relatively low resolution, and limited content greatly limited its utility and, ultimately, ensured its lack of success.

All of these designs, including the newest Kindle, overlook the rather profound question of what makes for a satisfying book-reading experience.

It all boils down to the fact that reading a book is just that, something one does with paper.  No amount of searchable text, clickable links, and video wizardry will replace that experience, and putting a table of contents, page numbers, and an index around words that come to the reader electronically is a different reading experience.

Books also have other advantages, including a drop-proof, shock-proof chassis, extremely low power consumption, and a bulletproof operating system.

What we read from did migrate once before. By the end of antiquity, the codex had replaced the scroll.  The codex user interface was improved over time with the separation of words, use of capital letters, and the introduction of punctuation, as well as tables of contents and indices.  This worked so well, in fact, that 1500 years later, the format remains largely unchanged.

With the original Kindle, the reader experience, while light-years ahead of reading a book on a laptop, was still greatly lacking compared to the pleasure readers continue to derive from paper books (it appears we are at the cusp of having to create a retronym, “paper books,” to describe the non-eBook variety).  My 1996 “invention” of the Lazerbook, an in-home device that printed books on demand on reusable paper, has still not been built, but I suspect that, were it to arrive on the scene today, readers would still prefer paper.

This week Amazon introduced Kindle 2.  Although units are not yet available for purchase or testing (Amazon is accepting pre-orders now), I suspect that I will like this Kindle a whole lot more.  In addition to the new Kindle, Amazon said it would start to sell e-books that can be read on non-Kindle devices, including mobile phones.  It also announced an exclusive short story by Stephen King.

Kindle 2, sporting a new design with round keys and a short, joystick-like controller, has seven times the memory of the original version, a sharper display, and it turns pages faster.  Despite these improvements, the price remains the same: $359.  At the launch, Amazon CEO Jeff Bezos told the audience that “our vision is every book, ever printed, in any language, all available in less than 60 seconds.”  Amazon also announced Whispersync, a feature that allows the reader to start a book on one Kindle and continue where he left off on another Kindle or supported mobile device.

Apple and Google, not traditional book publishers, represent the greatest challenge to the Kindle beyond, of course, the codex.  Google has, to date, scanned millions of books, many out of print and hence not easily available in traditional form.  Readers can find several e-book programs online for the iPhone and iPod Touch.

What will the future hold? Check with me in, say, 1500 years.

You can order the new Kindle from Amazon.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

E-mail and the Network Effect

Thursday, February 5th, 2009 by Jonathan Spira

E-mail usage is a very good example of the network effect, which describes the effect that one user of a good or service has on the value of that product to others and is usually thought to be a positive thing (the telephone would be useless if only one or two people had such a device but the more people who own telephones, the more valuable each telephone is to its owner).

E-mail of course has benefitted from the network effect.  When e-mail was first invented, there was a limited number of users on the Arpanet who could send and receive messages.  When MCI Mail and CompuServe’s mail systems were connected to the NSFNET in the late 1980s, this first commercial use of Internet-based e-mail greatly expanded the base of users, and the value of e-mail increased commensurately.

Just as networks become congested at some point after achieving critical mass (an excellent example was MCI’s long-distance network in the 1980s, when the company sold the service to more customers than its nascent network could handle, leading to busy signals and incomplete calls), a negative network effect can ensue.
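The dynamic described above can be sketched in a few lines. This is an illustrative toy model, not a formal economic one: the pair count follows the familiar n(n-1)/2 formulation often associated with Metcalfe's law, while the capacity threshold and quadratic congestion penalty are invented assumptions to show how a negative network effect can overwhelm the positive one.

```python
# Toy model of the network effect: value grows roughly with the number
# of possible sender-receiver pairs, but past a hypothetical capacity
# each additional user imposes a congestion cost on everyone else.

def possible_pairs(users):
    """Distinct conversation pairs among n users: n * (n - 1) / 2."""
    return users * (users - 1) // 2

def perceived_value(users, capacity, congestion_penalty=5.0):
    """Pair count minus an assumed quadratic penalty for overload."""
    overload = max(0, users - capacity)
    return possible_pairs(users) - congestion_penalty * overload ** 2

print(possible_pairs(2))    # two phones → 1 possible conversation
print(possible_pairs(100))  # a hundred phones → 4950 possible conversations

# With a capacity of 50 users, value keeps rising for a while past
# capacity, then collapses as congestion dominates.
print(perceived_value(50, 50), perceived_value(80, 50))
```

The exact penalty shape is arbitrary; the point is only that once resources are constrained, adding traffic can subtract value, which is precisely the situation described for e-mail.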

Today, this is happening in e-mail as resources (mostly the time knowledge workers can allocate to e-mail) are becoming increasingly constrained while knowledge workers continue to pump more and more e-mail into the system, further exacerbating the problem.  Making matters even worse is that the quality of e-mail messages is frequently lacking when compared to more formal correspondence such as a memo or letter.

Next week we’ll look at other issues relating to e-mail overload.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

Google Glitch: Human Error the Culprit

Sunday, February 1st, 2009 by Jonathan Spira
The Google “warning” Saturday morning

A glitch in the Google search service caused the company to warn users – including me early Saturday morning – that every Web site listed in the results could cause harm to their computer.

While doing a search on Google at that time (yes, my work-life balance has been decimated), I noticed something funny about Google’s results.  Every result included a disclaimer that “[T]his site may harm your computer.”  Fearing a virus or other malware (although I couldn’t see how it could possibly have this effect), I tried several other computers, including a Mac running Safari.  All searches, regardless of topic, computer, or browser, returned similar warnings.  In addition, although they were present and highlighted in green, the links to the actual Web sites were not clickable.

The problem seemed to last for about an hour.

Google later acknowledged on its blog that all searches during that time period produced links with the same warning message.

The warning was not limited to English

“What happened?” Google explained in the blog. “Very simply, human error.”  Unbeknownst to most of us, Google does maintain a list of sites that install malware on visitors’ computers in the background.

The list of sites is periodically updated, and Google released an update Saturday morning.  This is where the human error came in: a Google employee included the URL “/” in the list, and since “/” is part of every URL, every result was flagged.  Google caught the problem fairly quickly; according to the company, the maximum duration of the problem for a given user was about 40 minutes.  It seemed to affect me a bit longer than that, but then the problem disappeared.
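A simplified sketch makes clear why a single "/" entry was so destructive. This is not Google's actual implementation (the pattern names and matching rule below are invented for illustration); it merely shows how a blocklist that matches patterns anywhere inside a URL behaves once "/" slips into the list.

```python
# Simplified sketch of how a "/" entry poisons a URL blocklist.
# Illustrative only: pattern list and matching logic are assumptions,
# not Google's actual malware-flagging code.

bad_site_patterns = ["badsite.example/", "malware.example/download"]

def is_flagged(url, patterns):
    """Flag a URL if any blocklist pattern appears anywhere within it."""
    return any(pattern in url for pattern in patterns)

print(is_flagged("https://www.nytimes.com/", bad_site_patterns))  # → False

# One bad update later, the bare string "/" lands in the list...
bad_site_patterns.append("/")

# ...and now every URL matches, because "/" is part of every URL.
print(is_flagged("https://www.nytimes.com/", bad_site_patterns))  # → True
```

With this kind of matching, a single degenerate pattern flags the entire Web, which is exactly the behavior users saw that Saturday morning.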

Fortunately, I made several screen captures of the error for posterity.

Google does have a reputation for an extremely reliable service although errors do creep in from time to time.  Last month, a glitch in Google Maps sent drivers travelling within Staten Island on a 283-kilometer detour to Schenectady.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

Clicking on a link led to this page on Saturday.