Archive for January, 2010

What Aspects of Information Overload Impact You the Most?

Thursday, January 28th, 2010 by David Goldes

Information Overload is never far from our thoughts here at Basex but, with the cost of the problem looming at some $900 billion per annum, it’s sometimes possible to lose sight of the impact this scourge has on each individual.

We’re trying to document individual experiences with and impact arising from Information Overload and would appreciate your help.

Please rate the top three factors or contributors to Information Overload (in terms of the impact on you personally) and tell us how and why they impact you and to what extent.  We will keep your response(s) private and anonymous.

Please e-mail your responses privately to ioanswer@basex.com and we will report back to you in the coming months.

Jonathan B. Spira is CEO and Chief Analyst at Basex.  He can be reached at jspira@basex.com.

Searching for Needles in Haystacks: How our brain sabotages our searches

Thursday, January 28th, 2010 by Cody Burke

In a recent study funded by the U.S. Department of Homeland Security (DHS) and reported in LiveScience, researchers found that subjects’ expectations of finding something had a direct effect on their success rates for finding the items in question.

Found it yet?


In the study, subjects looked at X-ray scans of checked baggage and tried to identify the presence of guns and knives.  In the first trial, a gun or knife was present in 50% of the bags, and subjects only missed the weapons 7% of the time.  In the second trial, the guns and knives were in only 2% of the bags, and the subjects missed the weapons 30% of the time.  In short, when something is harder to find, our accuracy in identifying it drops significantly.
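To put the study’s percentages in absolute terms, a quick back-of-the-envelope calculation helps (the 1,000-bag screening below is our own illustration, not a figure from the study):

```python
def screening_outcomes(bags, prevalence, miss_rate):
    """Return (weapons present, weapons missed) for a screening trial."""
    present = bags * prevalence
    missed = present * miss_rate
    return present, missed

# Trial 1: weapons in 50% of bags, missed 7% of the time
print(screening_outcomes(1000, 0.50, 0.07))  # 500 present, 35 missed

# Trial 2: weapons in 2% of bags, missed 30% of the time
print(screening_outcomes(1000, 0.02, 0.30))  # 20 present, 6 missed
```

Fewer weapons slip through in absolute terms in the second trial only because so few are present; the chance of missing any given weapon more than quadruples.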

This is a trick our brain plays on us: when we repeatedly fail to find what we are looking for, the brain becomes bored and stops paying attention, so we miss the target when it finally does appear.

While the implications for airline security are obvious and somewhat chilling, the implications for the enterprise are also worth examining.  Knowledge workers spend ca. 15% of their day searching for content.  Applying the lessons learned in the DHS study, we can assume that if a search query returns fewer correct results in relation to incorrect results, the knowledge worker’s accuracy in picking out the relevant items will decline.

Conversely, just as in the DHS study, if the correct to incorrect ratio is better, meaning there is a higher number of correct results, then the knowledge worker is much more likely to find more of them.

For knowledge-based organizations and providers of software to these groups, the lessons from this study are clear: search tools must be improved to provide better ratios of relevant, useful results.  Today’s search tools focus on returning large sets of results and the answers to a search query may very well lie somewhere within these.  However, the low signal-to-noise ratio virtually ensures low accuracy even if one were to comb through every last result.

Search results need to be highly contextual and limited in volume to ensure accuracy and provide a favorable ratio of correct to incorrect results.  This keeps the knowledge worker engaged and not feeling that he is looking for a needle in a haystack; this, in turn, increases the probability of identifying the needed content.
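A minimal sketch (with entirely hypothetical documents and relevance scores) of why a smaller, better-filtered result set gives the searcher a more favorable correct-to-incorrect ratio:

```python
# Hypothetical search results as (document, relevance score) pairs;
# scores at or above the threshold are treated as "correct" hits.
results = [("q3-report.doc", 0.92), ("memo-old.doc", 0.31),
           ("budget.xls", 0.88), ("newsletter.txt", 0.15),
           ("q3-notes.doc", 0.79), ("spam.txt", 0.05)]

def precision(items, threshold=0.5):
    """Fraction of returned items that are actually relevant."""
    relevant = sum(1 for _, score in items if score >= threshold)
    return relevant / len(items)

# Returning everything: 3 relevant out of 6 -> precision 0.5
print(precision(results))

# Returning only the top 3 by score: all relevant -> precision 1.0
top3 = sorted(results, key=lambda r: r[1], reverse=True)[:3]
print(precision(top3))
```

The trimmed result set asks the knowledge worker to inspect half as many items while containing every relevant one, which is precisely the "favorable ratio" the DHS study suggests keeps attention engaged.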

Cody Burke is a senior analyst at Basex.

Apple’s iPad: Is This the Year of the Tablet?

Wednesday, January 27th, 2010 by David Goldes

Apple made its second foray into the keyboardless computer industry yesterday with the launch of the iPad.

Is this the year of the tablet?


Similar devices have been around since the GRiDPad was introduced in 1989, although the GRiDPad tipped the scales at slightly over 2 kg.  Apple itself began selling the Newton, its first PDA, in 1993, but its handwriting recognition software and short battery life hampered its success.  Microsoft’s Windows-based Tablet PC has enjoyed a modicum of success but it is mostly used by professionals such as nurses and insurance adjusters who are on the go for much of their day.

In addition, early tablets lacked today’s high-speed wireless networking capabilities as well as Internet content, which today are both more than plentiful.

With the iPad, Apple hopes to leverage the iPhone’s success and create a new category of gadgets.  The iPad supports Web browsing, e-mail, videos, music (it essentially has a built-in iPod), eBooks, as well as applications designed especially for the device.  It will also support almost all of the 140,000 applications in the Apple App Store.  The iPad uses a Multi-Touch interface and a large virtual keyboard (it can also be used with a traditional keyboard). It comes with a 9.7″ LED backlit display that provides a 178° viewing angle.  The machine will be supplied with either 16, 32, or 64 GB of flash storage, Wi-Fi and Bluetooth connectivity and, on higher priced models, the ability to connect to 3G networks.

Although there was much speculation about potential partners for the 3G connectivity, Apple will continue (for now at least) to rely on AT&T’s 3G network for the iPad in the United States, despite the many complaints iPhone users have had about their AT&T 3G service.

Apple’s iPad comes at a time when there are full-functioned netbooks on the market for under $300 – and these have a real keyboard.  Granted, they lack Apple’s vaunted UI but just how many portable devices do most people really need?  Apple is betting on customers going for a superior user experience and greater Net usage (the iPad uses flash memory, which gets expensive; the 64 GB model is $699).

Where the real impact may lie is in book, newspaper, and magazine publishing.  Amazon offers the Kindle, a black-and-white eBook reader that is the leader in what is essentially a small, niche market.  Amazon has been trying to branch out with an App Store-like offering, but the superior color interface of Apple’s iPad could put Apple in the lead.

Publishers are looking to Apple to create a new model that will let them advertise and monetize their content.  Taking a different path from Amazon’s, Apple is allowing book publishers to set their own prices (Amazon sets Kindle pricing).  Companies such as the New York Times and game-maker Gameloft are developing iPad-specific apps.

Still, many questions remain.  Will the iPad reinvent traditional media?  Will consumers want to carry yet another device (the iPad is not a phone)?  Stock analysts are bullish on Apple and the iPad.  The company’s stock rose 1.5% yesterday to $208.99 and some analysts are predicting a high of as much as $285 over the coming 12 months.

In the briefing room: Avaya’s post-Nortel roadmap

Thursday, January 21st, 2010 by Cody Burke

One of the final chapters in Nortel’s history has now been written.  Nortel’s vaunted Enterprise Solutions unit has been acquired by Avaya, a company that, similar to Nortel, traces its origins back to Alexander Graham Bell’s original patent for the telephone in 1876.


The end of an era

This week, Avaya announced its roadmap for the integration and continuation of products and services from Nortel.  This is good news for customers of both companies, as Avaya management has found a way to meld the best offerings from each into a unified set of products.  However, there are a few speed bumps ahead.

First, customers not on the platform that becomes part of the merged product line face a forklift upgrade and significant cost in the not-so-distant future, as the product portfolios from Nortel and Avaya were largely proprietary and incompatible with one another.

In addition, based on how past mergers of similarly-sized tech firms have fared, Avaya faces multiple challenges as it integrates multiple platforms and workgroups while trying to maintain its ability to service its customers at the levels they require and are accustomed to.  Moreover, Avaya expects to support its newly-expanded product portfolio with a newly-shrunk workforce.

The flagship unified communications offering for Avaya will be Avaya Aura.  Aura will be enhanced with the addition of Avaya (formerly Nortel) Agile Communications Environment (ACE) as well as the inclusion of technology from Nortel for a common management infrastructure.  For existing Nortel customers, Aura can be added and will sit on top of existing deployments.  Likewise, Aura customers can add Nortel solutions to their deployments.

In the roadmap, Avaya laid out a move towards a SIP-based system that is multimodal with an open rules engine and conference-based communications.  To this end, Avaya Contact Center Elite will continue as the flagship enterprise solution, and Nortel Contact Center 7 will remain as a mid-market solution.  The release of Contact Center 8 will add features and technology from Contact Center Elite, with the ultimate goal of improving scalability in order to enable the company to offer one contact center solution to cover everything from the middle market to high-end deployments.

For the small- and mid-sized enterprise market, Avaya plans to continue to supply Nortel Business Communications Manager, Norstar, Partner, and Integral 5, but it will eventually merge these solutions into Avaya IP Office as the flagship hybrid offering.  Nortel’s Software Communication System will be the flagship offering for SIP environments.

With regard to data products, Avaya announced it will adopt the current roadmap of data products from Nortel, including offerings for Ethernet Switching, Routers, Wireless Networking, Access Control, and Unified Management.

What Avaya has released thus far is a roadmap; many details have yet to be announced that should further clarify what Avaya’s combined offerings will look like.  Avaya did have plenty of time to contemplate and prepare for the merger and, if nothing else, we give them an A+ for effort here.

Cody Burke is a senior analyst at Basex.

Multitasking Injuries Mount

Monday, January 18th, 2010 by Jonathan Spira

A recently-released study from Ohio State University showed that the number of pedestrians who visited an emergency room after becoming distracted and injuring themselves doubled to 1,000 in 2008 from 2007, a figure that was itself nearly double the 2006 total.

Catch me if you can!


One doesn’t have to go far to find a pedestrian engrossed in some form of handheld electronic device these days as society and information both become more mobile.

While many believe that they are successfully multitasking, this couldn’t be further from the truth.  The human brain is simply not wired to multitask; it is a limitation most everyone wants to wish away, but it didn’t miraculously change with the dawn of the Information Age.

When we reviewed the research done for “The Cost of Not Paying Attention” five years ago, it came as no surprise that knowledge workers lost as much as a quarter of the workday to interruptions and a phenomenon known as recovery time.  Each interruption comes at a cost – namely the recovery time, or the time it takes to get back to where one was prior to the interruption – which is typically 10 to 20 times the duration of the interruption itself.
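The arithmetic behind that quarter-of-a-workday figure is straightforward; the interruption count in this sketch is hypothetical, while the 10-to-20-times recovery multiplier comes from the research:

```python
def daily_loss_minutes(interruptions, interruption_min, recovery_multiplier):
    """Time lost per day: each interruption costs its own duration
    plus a recovery period that is a multiple of that duration."""
    per_event = interruption_min * (1 + recovery_multiplier)
    return interruptions * per_event

# E.g., 12 one-minute interruptions at the low (10x) end of the
# recovery range: 12 * (1 + 10) = 132 minutes, which is already
# more than a quarter of an 8-hour (480-minute) workday.
lost = daily_loss_minutes(12, 1, 10)
print(lost, lost / 480)
```

At the high (20x) end of the range, even fewer interruptions are needed to consume the same quarter of the day.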

What actually happens is a form of time slicing: rather than handling multiple tasks at once, our brains stop and start repeatedly as they switch between individual tasks.  The recovery time from these stops and starts adds up as well.

If only people would focus on the task at hand – whether it be writing a report or walking down the street – without trying to juggle multiple tasks, everyone would be a lot better off.  President Gerald Ford was once accused of being too stupid to walk and chew gum at the same time.  Perhaps he simply chose to focus on one task at a time.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

Intelligence Gathering Meets Information Overload

Thursday, January 14th, 2010 by Cody Burke

In 2007, the Air Force collected ca. 24 years’ worth of continuous video from Predator and Reaper unmanned drones in the skies over Afghanistan and Iraq, a fact first reported by the New York Times just days ago.

Shall we drone on?


All video collected is watched live by a team of intelligence analysts, so this translates into ca. 24 years of analyst team time being used in one year.

The amount of data (and the amount of time spent watching the videos) will only grow as more advanced drones are deployed that can record video in ten (with future plans for up to 65) directions at once instead of the one direction that is currently supported.
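Because all of the video is watched live, analyst load scales linearly with the number of simultaneous camera directions.  A rough calculation using the reported figures:

```python
# One year of single-direction collection already consumed ~24 years
# of analyst-team time (per the New York Times report).
baseline_analyst_years = 24

# If each drone records in 10 directions at once (future plans call
# for up to 65), and all video is still watched live, the analyst
# load grows by the same factor.
for directions in (1, 10, 65):
    print(directions, baseline_analyst_years * directions)
# 1 -> 24, 10 -> 240, 65 -> 1560 analyst-team-years per year of flying
```

The collection side scales with hardware; the analysis side, as currently practiced, scales only with people.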

The use of UAVs (unmanned aerial vehicles) is expanding not only in the military but also domestically, as the U.S. Customs and Border Protection agency and local police forces begin to use these tools.  The advantages are clear: pilots are not in danger, intelligence gathering capabilities are improved, and the ability to conduct strikes in remote areas is enhanced.

There are, of course, myriad issues presented by the use of UAVs in military operations, ranging from humanitarian arguments that drone missile strikes are more likely to result in civilian casualties to political considerations about where they can operate, as seen in recent disagreements with Pakistan.

Complicating and contributing to these issues is the huge problem of how to deal with the flood of information that drones are returning to the analysts.

Mistakes such as falsely identifying threats can lead to unnecessary and potentially tragic civilian casualties, which could then inflame international public opinion and impair the military’s ability to operate effectively.  Likewise, missing a real threat because of Information Overload could also lead to fatalities.

The use of these tools will only increase, making it critical that we develop systems that can effectively organize, parse, tag, and act upon the data that is collected.  The Air Force in particular is working on this problem, but will have to move quickly to stay ahead of the mountain of incoming data.

Cody Burke is a senior analyst at Basex.

In the briefing room: Consumer Electronics Show 2010

Wednesday, January 13th, 2010 by David Goldes

The annual Consumer Electronics Show (CES) is so large that it defies categorization and can create a unique kind of Information Overload.

What's new in Vegas for 2010?


Indeed, many companies schedule multiple pre-CES briefings to ensure that they reach their intended audience (including us).  To spare you from such overload, here are three notable new products that we think you should know about.

HP Notebook Projector Companion
Despite great advances in projector technology, many meetings are marred by an inability to get an important presentation on screen (or on a wall, for that matter).  Most people don’t travel with their own projector; rather, many rent projectors at hotels or meeting facilities (typically at exorbitant rates).  HP has a tiny yet powerful solution that weighs only 260 g yet can project a high-quality 60-inch-wide image from up to 2.5 m away.  The image is more than good enough for most meetings and much sharper than that of a pico projector.

Iomega v.Clone
Iomega, now part of EMC, an information storage and management company, offers a hard drive management utility that allows you to take a snapshot of your computer’s operating system, applications, and data files with you on an Iomega drive.  This means that mobile knowledge workers can access their data from any computer (as long as they have the drive with them) and can also make setting up a new or replacement PC much less of a chore.

Lenovo Skylight
Does the world need another category of tablet or laptop?  Lenovo is betting that it does and introduced the Skylight, which might best be described as a laptop crossbred with a smartphone.

The Skylight is always on, just like a smartphone, runs a version of Linux on Qualcomm’s Snapdragon chip, and does not run conventional laptop software; instead it relies on a unique user interface composed of live Web gadgets (it comes preloaded with 18, including ones for Gmail, YouTube, and Facebook) as well as a traditional Web browser.  It connects to both Wi-Fi and mobile broadband (AT&T will sell the Skylight to run on its 3G network).

David M. Goldes is president and senior analyst at Basex.

In the briefing room: Liaise moves into public beta

Thursday, January 7th, 2010 by Jonathan Spira

It is always interesting to come back to a new product as it moves through the beta process and see what has changed.

Not so dangerous liaisons

Not so dangerous liaisons

A few months ago we wrote about Liaise, an inbox add-on for Outlook that scans e-mail messages as they are being composed and creates a task list based on any action items it finds in the e-mail.

Liaise recently moved into public beta with the addition of several new features that improve its functionality and fine-tune how the tool works.  What we liked about Liaise when we first heard about it was that it captures the action items that lurk in every e-mail and keeps them from falling through the cracks.
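Liaise’s extraction technology is proprietary, but the general idea of mining action items from e-mail text can be sketched with naive keyword patterns (the patterns and names below are our own illustration, not Liaise’s implementation):

```python
import re

# Naive cue phrases that often introduce an action item; a real
# product uses far more sophisticated linguistic analysis than
# simple regular expressions.
ACTION_CUES = re.compile(
    r"(?:please|could you|can you|remember to|don't forget to)\s+([^.?!]+)",
    re.IGNORECASE)

def extract_action_items(email_body):
    """Return a crude task list pulled from an e-mail's text."""
    return [match.strip() for match in ACTION_CUES.findall(email_body)]

body = ("Hi Anna. Please send the revised budget by Friday. "
        "Also, don't forget to book the conference room for Tuesday.")
print(extract_action_items(body))
# ['send the revised budget by Friday',
#  'book the conference room for Tuesday']
```

Even this toy version shows why the approach is valuable: the tasks are captured the moment the message is written, before they can fall through the cracks of the inbox.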

Liaise has added calendar integration for Outlook so due dates for action items pulled from e-mail appear in the Outlook calendar, as well as in the calendars of mobile devices that are set up to synch with Outlook.  This is the logical step for the product and links tasks, e-mail, and the calendar together.  Not having the action items integrated into the calendar was not a major problem, but the tool’s utility is definitely enhanced with this feature.

Another addition to Liaise is the ability to control more of what is displayed in e-mail messages.  In some situations, it may be preferable to have an e-mail appear normal to the recipient, particularly if that person is not a Liaise user.  At other times, for instance if the e-mail is internal only and all recipients are using Liaise, it may be useful for information about the action items pulled from the e-mail to appear in it.  The private beta of Liaise displayed this information by default.  More control is almost always a good thing and this makes the tool more likely to be used.

Liaise has also added support for cloud-based synching of project information among teams.  Particularly useful for keeping partners, clients, and disparate project teams up to date on project and action-item statuses, this allows information on projects to be updated when changes are made, without the use of e-mail.  Updates to projects can also be condensed into a single e-mail, in the event that the knowledge worker wishes to see a list of changes in one place.  Anything that cuts down on overall inbox traffic is to be applauded, although we do have lingering concerns about combining items in a single e-mail, as something may get overlooked.

As Liaise moves through the beta process, the company is adding features and tweaking the user interface.  From what we have seen so far, the company is focused on improving integration, control, and the ability to synch information between users.  We like Liaise and think it has the potential to fix several of the problems that plague e-mail with respect to project and task management.  Looking towards the general release of the product as it moves out of beta, first on our wish list for future enhancements would be the expansion of the tool beyond Outlook.

Cody Burke is a senior analyst at Basex.

The Christmas Day Terrorism Plot: How Information Overload Prevailed and Counterterrorism Knowledge Sharing Failed

Monday, January 4th, 2010 by Jonathan Spira

There is no question that analyzing mountains of information and determining what is important, urgent, and worthy of follow-up (three separate and distinct categories) is a daunting task in any organization.

Are we sharing all of our knowledge yet?

Are we sharing all of our knowledge yet?

When the organization is the United States Federal Government and the amount of information that has to be addressed daily dwarfs what most people can conceptualize, lives may be at stake when an individual or system fails to connect the dots.

Such a failure occurred on December 25, 2009, but it need not have.

The tools to manage information on a massive scale do indeed exist and it is clear that the U.S. government is either not deploying the right ones or not using them correctly.

The National Counterterrorism Center, created in 2004 following recommendations of the 9/11 Commission, has a mission to break “the older mold of national government organizations” and serve as a center for joint operational planning and joint intelligence.  In other words, various intelligence agencies were ordered to put aside decades-long rivalries and share what they know and whom they suspect.  Unfortunately, while this sounds good in theory, in practice this mission may not yet be close to being fully carried out.

In addition to the fact that old habits die hard (such as a disdain for inter-agency information sharing), it appears that the folks at the NCTC failed to grasp basic tenets of knowledge sharing, namely that search, in order to be effective, needs to be federated and contextual, that is to say it needs to simultaneously search multiple data stores and present results in a coherent manner.

Discrete searches in separate databases will yield far different results than a federated search that spans multiple databases.  All reports indicate that intelligence agencies were still looking at discrete pieces of information from separate and distinct databases, and the agencies themselves were not sharing all that they knew.
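The distinction can be sketched in a few lines of code; the data stores below are hypothetical stand-ins for real agency databases:

```python
# Each "agency" holds fragments about the same subject; no single
# store contains enough indicators to raise an alarm on its own.
state_dept = {"abdulmutallab": ["visa application", "father's warning"]}
nsa = {"abdulmutallab": ["yemen plot involving a nigerian man"]}
faa = {"abdulmutallab": ["cash ticket purchase", "no checked luggage"]}

def federated_search(query, *stores):
    """Query every store at once and merge the results into one view."""
    merged = []
    for store in stores:
        merged.extend(store.get(query, []))
    return merged

hits = federated_search("abdulmutallab", state_dept, nsa, faa)
print(len(hits), hits)
# A single merged view of 5 indicators; any one store alone shows at most 2.
```

A discrete search of any one store returns a fragment that looks unremarkable; only the merged, contextual view makes the pattern visible.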

In this case, much was known about Umar Farouk Abdulmutallab, the Nigerian man accused of trying to blow up Northwest Flight 253.  In May, Britain put him on a watch list and refused to renew his visa.  In August, the National Security Agency overheard Al Qaeda leaders in Yemen discussing a plot involving a Nigerian man.  In November, the accused’s father warned the American Embassy (and a CIA official) in Abuja that his son was a potential threat.  As a result, the son was put on a watch list that flagged him for future investigation.  He bought his plane ticket to Detroit with cash and boarded the flight with no luggage.  Yet, almost unbelievably, no one saw a pattern emerge here.

Shouldn’t a system somewhere have put the pieces of this puzzle together and spit out “Nigerian, Abdulmutallab, Yemen, visa, plot, cash ticket purchase, no luggage = DANGER!”?

Information Overload is partially to blame as well.  Given the vast amount of intelligence that the government receives every day on suspected terrorists and plots, it could very well be that analysts were simply overwhelmed and did not notice the pattern.  Rather than being immune from the problem, given the sheer quantity of the information it deals with, the government is more of a poster child for it.

Regardless of what comes out of the numerous investigations of the Christmas Day terrorism plot and the information-sharing failures of the various intelligence agencies, one thing was abundantly clear by Boxing Day: the Federal Government needs to greatly improve its ability to leverage the intelligence it gathers and connect the dots.

Clearly, there are many changes that need to occur in order to improve security but one relatively simple way for the government to proceed is to take the first steps to lower the amount of Information Overload and raise the signal-to-noise ratio so that critical information can rise to the top.

Jonathan B. Spira is CEO and Chief Analyst at Basex.