Archive for the 'Information Management' Category

The Christmas Day Terrorism Plot: How Information Overload Prevailed and Counterterrorism Knowledge Sharing Failed

Monday, January 4th, 2010 by Jonathan Spira

There is no question that analyzing mountains of information and determining what is important, urgent, and worthy of follow-up (three separate and distinct categories) is a daunting task in any organization.

Are we sharing all of our knowledge yet?


When the organization is the United States Federal Government and the amount of information that has to be addressed daily dwarfs what most people can conceptualize, lives may be at stake when an individual or system fails to connect the dots.

Such a failure occurred on December 25, 2009, but it need not have.

The tools to manage information on a massive scale do indeed exist and it is clear that the U.S. government is either not deploying the right ones or not using them correctly.

The National Counterterrorism Center, created in 2004 following recommendations of the 9/11 Commission, has a mission to break “the older mold of national government organizations” and serve as a center for joint operational planning and joint intelligence.  In other words, various intelligence agencies were ordered to put aside decades-long rivalries and share what they know and whom they suspect.  Unfortunately, while this sounds good in theory, in practice this mission may not yet be close to being fully carried out.

In addition to the fact that old habits die hard (such as a disdain for inter-agency information sharing), it appears that the folks at the NCTC failed to grasp a basic tenet of knowledge sharing: to be effective, search needs to be federated and contextual, that is, it needs to simultaneously query multiple data stores and present the results in a coherent manner.

Discrete searches in separate databases yield far different results than a federated search that spans multiple databases.  All reports indicate that intelligence agencies were still looking at discrete pieces of information in separate and distinct databases; moreover, the agencies themselves were not sharing all that they knew.
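The difference is easy to see in a toy sketch: a discrete search sees only one silo at a time, while a federated query spans them all and keeps provenance for context.  The store names and records below are purely illustrative, not actual intelligence systems.

```python
# A minimal sketch of federated vs. discrete search over hypothetical
# in-memory "databases". Store names and records are illustrative only.
def search_store(store, term):
    """Return records in one store that mention the term."""
    return [rec for rec in store if term.lower() in rec.lower()]

def federated_search(stores, term):
    """Query every store at once and merge results into one view."""
    merged = []
    for name, store in stores.items():
        for rec in search_store(store, term):
            merged.append((name, rec))  # keep provenance for context
    return merged

stores = {
    "visa_db": ["Abdulmutallab: UK visa renewal refused"],
    "sigint":  ["Yemen intercept: plot involving a Nigerian man"],
    "embassy": ["Abuja: father warns son is a potential threat"],
}

# A discrete search sees only one silo at a time...
print(search_store(stores["visa_db"], "plot"))   # → []
# ...while a federated search assembles the fuller picture.
print(federated_search(stores, "Nigerian"))
```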

In this case, much was known about Umar Farouk Abdulmutallab, the Nigerian man accused of trying to blow up Northwest Flight 253.  In May, Britain put him on a watch list and refused to renew his visa.  In August, the National Security Agency overheard Al Qaeda leaders in Yemen discussing a plot involving a Nigerian man.  In November, the accused’s father warned the American Embassy (and a CIA official) in Abuja that his son was a potential threat.  As a result, the son was put on a watch list that flagged him for future investigation.  He bought his plane ticket to Detroit with cash and boarded the flight with no luggage.  Yet, almost unbelievably, no one saw a pattern emerge here.

Shouldn’t a system somewhere have put the pieces of this puzzle together and spit out “Nigerian, Abdulmutallab, Yemen, visa, plot, cash ticket purchase, no luggage = DANGER!”?

Information Overload is partially to blame as well.  Given the vast amount of intelligence that the government receives every day on suspected terrorists and plots, it could very well be that analysts were simply overwhelmed and did not notice the pattern.  Rather than being immune from the problem, given the sheer quantity of the information it deals with, the government is more of a poster child for it.

Regardless of what comes out of the numerous investigations of the Christmas Day terrorism plot and the information-sharing failures of the various intelligence agencies, one thing was abundantly clear by Boxing Day: the Federal Government needs to greatly improve its ability to leverage the intelligence it gathers and connect the dots.

Clearly, there are many changes that need to occur in order to improve security but one relatively simple way for the government to proceed is to take the first steps to lower the amount of Information Overload and raise the signal-to-noise ratio so that critical information can rise to the top.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

New Year’s Resolutions for the Information Overload-Minded

Tuesday, December 29th, 2009 by Jonathan Spira

It took radio 38 years to reach 50 million people and television only 13 years.


Here's to less information!

It took the Internet a mere four years to reach that number.  Just last month there were 10 billion Web searches performed and most people didn’t find what they were looking for.

Indeed, 50% of all searches fail at first blush and, to make matters worse, 50% of the searches believed to be successful also fail to some extent, unbeknownst to the user.

Clearly something has to be done.  With the New Year almost upon us, we can start with a few simple New Year’s Resolutions that are time tested in lowering the amount of Information Overload we all face.

1.)    Learn better search techniques.  Control search results using Boolean logic by typing AND or OR and use advanced options to narrow the field.

2.)    Use restraint in communications.  Don’t cc the world, don’t include more people than necessary in any communication, avoid gratuitous “thanks” and “great” replies, and avoid reply-to-all at all costs.

3.)    Write clearly.  Better yet, refrain from combining multiple themes and requests in one single e-mail.  And make sure the subject is specific as opposed to general (writing “Help needed” without further details helps no one, especially the recipient).  These simple steps will add instant clarity with little effort.

4.)    Read what you write – before you click send.  Unclear messages result in excessive back-and-forth that could have been avoided had the first missive been unambiguous and to the point.

5.)    Read what others write – before replying.  While it would be nice to believe that people place the most important information at the very beginning, the key facts are often buried in the closing paragraphs.  What you are about to ask may already have been covered.

6.)    Value your colleagues’ time as if it were your own.  If a response to an e-mail is not immediately forthcoming, don’t pick up the phone or send an IM saying “did you get my e-mail?”.

Happy New Year!  Prosit Neujahr!

Jonathan B. Spira is CEO and Chief Analyst at Basex.

Search: How to Find What You Are Looking For (or 5 Tips for Better Search)

Thursday, December 17th, 2009 by Jonathan Spira

50% of all searches fail in a manner that the person doing the search recognizes as a failure. 


What is it that you are looking for, my dear?

A far more significant problem is that 50% of the searches believed to have succeeded failed, but the person doing the search simply doesn’t realize it.  As a result, that person uses information that is at best out of date but more often incorrect or just not the right data.  When the “bad” information is then used in a document or communication, there is a cascading effect that further propagates the incorrect information.
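If the two 50% figures are taken as independent, the arithmetic is sobering: only one search in four truly succeeds.

```python
# Compounding the two failure figures cited above.
visible_failure = 0.50     # searches that visibly fail
hidden_failure = 0.50      # share of "successful" searches that silently fail

apparent_success = 1 - visible_failure             # 0.50
true_success = apparent_success * (1 - hidden_failure)
print(f"True success rate: {true_success:.0%}")    # → True success rate: 25%
```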

In an age where Information Overload costs the U.S. economy ca. $900 billion per annum, finding the right information has become far more critical.

To increase the odds that you will find what you are looking for, we’ve prepared five simple search tips that should result in better and more accurate results, regardless of where you are searching.

1.)    Boolean logic
Search engines typically use a form with a search box into which one types the search query.  To control the search results, use Boolean logic by typing AND or OR.  Many search engines including Google default to AND when processing search queries with two or more words.  To exclude words, use NOT (java NOT coffee, java -coffee).  For increased relevance, use NEAR (restaurants NEAR midtown Manhattan).

2.)    Options
Most search engines include options (on Google, these are found by clicking on Advanced Search).  Use options to narrow down the field you are searching.  Examples include file format (.ppt, .doc, .pdf, etc.) or Web site (basex.com).

3.)    Search tools
When it comes to search, one size does not fit all.  Use a variety of search tools beyond Google.  Try search visualization tools such as Cluuz and KartOO on the Web and KVisu for behind the firewall.

4.)    Meta search engines
A meta search engine runs several searches simultaneously.  Tools that may be helpful include Clusty and Dogpile.

5.)    Archived (out-of-date) materials or nonexistent Web sites
The Wayback Machine on the Internet Archive is useful for both older versions of Web pages and sites that have disappeared over time.
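The Boolean operators in tip 1 can be illustrated with a toy document set; the titles and the simple substring matching below are illustrative only, not how any particular search engine actually works.

```python
# Illustrative Boolean filtering over a made-up document set.
docs = [
    "java coffee roasting guide",
    "java programming tutorial",
    "coffee shops in midtown Manhattan",
]

def matches(doc, include=(), exclude=()):
    """AND together all include terms, NOT all exclude terms (case-insensitive)."""
    text = doc.lower()
    return all(t in text for t in include) and not any(t in text for t in exclude)

# java AND coffee — narrows the field
print([d for d in docs if matches(d, include=("java", "coffee"))])
# java NOT coffee (java -coffee) — excludes a sense of the word
print([d for d in docs if matches(d, include=("java",), exclude=("coffee",))])
```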

Jonathan B. Spira is CEO and Chief Analyst at Basex.

In the briefing room: Simplexo

Thursday, November 19th, 2009 by Cody Burke

Knowledge workers spend a good part of their day in search of information; therefore it is no surprise that having the correct search tools is of paramount importance.

Simplexo search simultaneously addresses structured and unstructured data.


The limitations that exist in many search tools, combined with poor search techniques, lead to the frequent use of outdated information, the recreation of content that exists but cannot be found, and the waste of significant amounts of time.

Failed searches are a very visible symptom of Information Overload.  It is generally acknowledged that 50% of searches fail outright but few realize that 50% of the searches that people believe to have succeeded actually failed too, in that they presented stale, incorrect, or simply second-best information.  This last figure is far more insidious because the knowledge workers are unaware of the searches’ failure and blithely proceed to use the incorrect information in their work.

Training and proper search techniques can make a huge difference in improving search results but equally important are the tools the knowledge worker uses.  In most cases, information is stored in separate silos, and search tools need to be able to reach across those boundaries.  A search query that reaches across multiple information stores at once provides far more complete and relevant results than multiple separate searches.

Simplexo is one company that is addressing these issues with its Simplexo Enterprise search offering.  The company accesses the native indexing capabilities of databases and existing software such as SharePoint and Outlook to index unstructured data.  Indexing takes place in real-time and runs continuously, which helps ensure that the most current information is presented in search results.  Data is de-duplicated to remove extra copies of information, reducing the overall amount of content that must be indexed.
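A minimal sketch of what de-duplication before indexing can look like, assuming a content-hashing approach (Simplexo's actual mechanism is not documented here):

```python
import hashlib

# Keep only the first copy of each distinct piece of content, so the
# indexer never processes (or returns) the same document twice.
def deduplicate(documents):
    seen, unique = set(), []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = ["Q3 sales report", "Meeting notes", "Q3 sales report"]
print(deduplicate(docs))  # → ['Q3 sales report', 'Meeting notes']
```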

Simplexo uses a dual index approach that looks at both structured data in real-time and unstructured data during processor idle time.  The system examines and retrieves unstructured data from sources such as Web pages, e-mail, text files, PDF files, and Open Office files, as well as data from structured sources such as databases, business applications, CRM applications, and information portals.  The wide net that Simplexo casts when searching has the potential to improve knowledge worker efficiency. The ability to look in multiple silos and repositories with a single search query is extremely helpful in ensuring that information that is buried in a far-flung repository or inbox folder is taken into consideration.

In addition to indexing, Simplexo supports native integration into platforms such as browsers, Office, Outlook, AutoCAD, and Lotus.  This integration is key to enabling the knowledge worker to remain in one work environment.  Simplexo also supports mobile devices via Simplexo Mobile for iPhones and Windows Mobile 6 devices, with plans to add BlackBerry support in the future.

Enterprise search solutions such as Simplexo take a realistic view of the challenges faced by knowledge workers when searching for information.  They recognize that relevant information can reside in any repository, be structured or unstructured, and is in need of continuous indexing to remain up-to-date.

Cody Burke is a senior analyst at Basex.

A New Measure of Information Overload – In Feet

Thursday, November 12th, 2009 by Jonathan Spira

It was right in front of me but I never noticed it until an in-depth conversation with a very well-informed CEO of a major auto maker earlier this week: how to measure Information Overload in a meaningful way.

How much information received today, dear?


“We send our dealerships,” the CEO told me, “about a foot or so of information every day.  There’s no way anyone can digest all of it.”  How did he measure this? The company printed out every piece of paper that goes out to the many dealerships around the country and that’s how high the average stack was.

This reminded me of an experiment the EDP (electronic data processing) manager, Dave Stemmer, tried at the company where my father was CEO, probably around 25 years ago (when IT departments were still called EDP departments).  He noticed that the department printed out dozens and dozens of reports a day (and the reports were on the green striped computer paper in binders) and wondered how many were actually being read.  So he stopped printing the reports and waited for the phone to ring with someone requesting them.  Apparently only 10% of the reports were re-requested so the waste in computer time (when this was a valuable commodity) and paper was huge.

Stemmer’s experiment, while less focused on the problem of Information Overload, does demonstrate man’s proclivity for creating too much information (or written versions of that information) that will go unused.

In the case of our auto maker, the amount of information was a wake-up call and the company is not only looking to reduce the amount of information sent to its dealerships but also looking to find ways of making that information more useful and relevant.

We know from our research that the cost of Information Overload is great and that what individual knowledge workers send to colleagues and correspondents can exacerbate an already bad situation.  Looking at it from a “how much does our organization send out en masse to individuals and partners” perspective is another way both to get a fix on the costs and to find places where a few feet of Information Overload can be eliminated.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

In the briefing room: Mindjet Catalyst

Thursday, October 15th, 2009 by Cody Burke

Collaboration should be a given in practically every task a knowledge worker undertakes.

Mindjet Catalyst


Frequently, however, it isn’t and, in many cases where collaboration does take place, it is not used to its best advantage.  Part of the reason is that much collaborative work takes place without a full picture of the project at hand.

Indeed, there exist different dimensions to collaboration and there is a significant need to connect knowledge workers, the collaborative process itself, and the organization with relevant complex information, ideas, and processes.  Given the trend towards both a dispersed workforce and collaboration among multiple entities, effectively managing a project requires new approaches to joining people with information.

One approach that will make collaboration between knowledge workers more effective is to ensure that the supporting information is captured in a form that adds context and is easily shareable.  To add context, information must be linked to people, documents, and other supporting content.  One method of doing this is to create a mind map.  Mind mapping is a technique for brainstorming and organizing data that starts with a central point and then adds branches with related content such as links, documents, attachments, notes, and tasks.  The resulting diagram is a visual guide to a set of information that allows knowledge workers to see the big picture and understand the context of what they are doing.
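A mind map is, at bottom, a tree whose branches carry related content.  A minimal sketch, with made-up labels and attachments:

```python
# A toy mind-map node: a label plus attached content (links, documents,
# notes, tasks) and branches to related topics.
class Node:
    def __init__(self, label, content=None):
        self.label = label
        self.content = content or []
        self.branches = []

    def add(self, label, content=None):
        child = Node(label, content)
        self.branches.append(child)
        return child

    def outline(self, depth=0):
        """Render the map as an indented outline (attachments not shown)."""
        lines = ["  " * depth + self.label]
        for branch in self.branches:
            lines.extend(branch.outline(depth + 1))
        return lines

root = Node("Product launch")                       # the central point
root.add("Marketing", ["press-release.doc"])        # branch with a document
root.add("Engineering", ["spec.pdf"]).add("QA plan")
print("\n".join(root.outline()))
```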

One company active in this space is Mindjet.  The company has made its reputation through the development of mind mapping products, such as its MindManager product line.  Adding further value to those capabilities, Mindjet recently launched Mindjet Catalyst, an online visual collaboration platform.  The offering has clear roots in the visual approach to mind mapping that the company is known for, and adds a team-based collaborative element.

Catalyst is an online service that can be accessed from anywhere via any Web browser and hooks into standards-based document repositories such as SharePoint.  Multiple users can make edits and attach supporting documents and other content to a mind map and have the changes reflected in real time.  The offering also includes pre-built map templates for common business situations such as online meetings or idea generation sessions.  Once maps are generated, they can be shared with colleagues (both users of Catalyst and those who do not use the product) via links that are e-mailed or posted on social networking sites.  Workspaces carry permission levels that grant reader, author, and owner access.  In addition, the environment is persistent, meaning that users are able to see changes that have occurred.

Catalyst also features integrated online chat functionality, and (optional) Web conferencing capabilities.  The integrated online chat embeds community into the work environment and allows for communication between colleagues without forcing them to leave the environment and switch tools.  The Web conferencing module includes desktop sharing, video and VoIP support, file transfer, and session recording.

Mindjet has taken a good and underappreciated idea, the visual mapping of information, and successfully integrated collaborative capabilities and tools into it.  Displaying information in a visual and connected way gives the knowledge worker context that is critically important for making informed decisions, capturing new information, and understanding business processes.  The addition of powerful collaborative elements extends the value of mind mapping by allowing knowledge workers to use the environment for the kind of collaborative team-based work that is a reality in the knowledge economy.

Cody Burke is a senior analyst at Basex.

Change Afoot in the Content Management Space

Thursday, October 1st, 2009 by David Goldes

Content management systems are taking on increasing importance in organizations of all sizes.

The content management market is seeing dramatic change thanks to new open source and commercial open source entries that are making significant inroads with customers. In addition, just to make things a bit more complex, companies need to prepare to manage multiple forms of content including wikis, blogs, RSS feeds, social networks, podcasts, and video.

This in turn has significantly changed the process of selecting a content management solution, a process that was never exactly straightforward as it requires an in-depth understanding of both the organization’s needs and what the market has to offer.

Consider that companies that spend hundreds of thousands of dollars for content management systems might do equally well with platforms that cost one-tenth that amount.

Content management is no longer a nice-to-have tool; given the critical role of content (in all of its forms) in the enterprise, CM platforms have now been accorded the status of essential IT infrastructure.  That’s why one sees names such as EMC, IBM, and Oracle in the space.

Basex estimates that the U.S. market for content management was $4.1 billion in 2008 and will reach $10 billion by 2014.  Open source content management is gaining traction in some circles and the overall open source software market is growing rapidly.

The increase in our reliance on content and the amount of content that is being created in the enterprise makes it even more critical that companies manage content effectively in order to avoid the problem of Information Overload.

To help companies navigate the space, Basex just released The Definitive Guide to Today’s Content Management Systems and Vendors, a 150-page report series.  The report series looks at 32 key content management vendors and 43 platforms and provides in-depth analysis — including market trends, drivers, and barriers — to guide decision makers in the selection process.

The good news is that companies today can find a wide range of content management systems at varying price points.  The bad news is that selecting the RIGHT platform is more critical than ever to a company’s future and most companies don’t have the resources to thoroughly investigate their options.  Managers have to understand the total cost of ownership, support options and functionality when making that decision.

The report series is being published on a subscription basis and includes an in-depth industry survey, Content Management Systems: The New Math for Selecting Your Platform, and 16 Vendor Profiles of key content management providers and their offerings.

The vendor profiles provide a comprehensive analysis of content management offerings from Autonomy, Acquia, Alfresco, Bluenog, Day Software, EMC, EpiServer CMS, FatWire, Hippo, IBM, Microsoft, MindTouch, Nuxeo, Oracle, Open Text and Xerox.

You can purchase the report at a special introductory price from the Basex Web site.

David M. Goldes is the president of Basex.

In the Briefing Room: eDev inteGreat

Thursday, October 1st, 2009 by Jonathan Spira and Cody Burke

Many people think of software development as lone programmers working in isolation, perhaps reminded of Douglas Coupland’s 1995 classic Microserfs, where programmers slide flat foods, such as “Kraft singles, Premium Plus Crackers, Pop-Tarts, grape leathers, and Freeze Pops” under the door of a fellow coder after they hadn’t seen him in days.  In reality, the process of software development is a collaboration-intensive activity that would benefit greatly from improved knowledge management technology and thinking, much in the way knowledge sharing and collaboration happen between workers in far less technical occupations.  Unfortunately, many managers fail to realize the necessity of actively managing knowledge and facilitating collaboration in this area.

Companies typically spend vast amounts of time and money to document their requirements and it is far from easy to keep such documentation up to date.  At the same time, they struggle to find ways to interrelate information, given that such information comes from diversified sources.  In other words, how does one create a document that leverages information that is anywhere and everywhere and still be able to make sense out of it?

One company that provides a tool in this area is eDev Technologies via the company’s inteGreat offering.  The product is a requirements management solution that allows for the creation and reuse of requirements through the development of a central body of knowledge, which the company refers to as iBoK (Integrated Body of Knowledge).  This knowledge base is a collection of reusable requirements.  InteGreat allows developers to create requirements using a drag-and-drop interface and then relate them to one another to aid in reuse.

Requirements are then visually mapped out as process flows using MS Visio, and are saved either as inteGreat files or exported as Visio files.  Users also have the ability to create mockups using an included simulation tool.  Once a process is created, generated documents are exported via MS Word, Excel, or Visio, or saved within inteGreat.

As in any form of knowledge work, the recreation of content, in this case requirements, is a huge and costly problem, and is essentially a problem of finding things and avoiding recreating that which already exists.  If knowledge workers cannot find information, be it a document or a requirement, they will have to recreate it, increasing project costs, squandering limited resources, and impacting an organization’s bottom line.  The end result of enabling the reuse of requirements is that, for future projects, there will be a reduction in the time and cost of gathering requirements, as well as a lessening of the burden of maintaining software.

In inteGreat, the ability to reuse requirements once they are developed adds a much needed knowledge management aspect to the development of requirements, affording software developers the same KM capabilities that other knowledge workers now take for granted.  In turn, as more companies adopt similar solutions, they will see increases in efficiency and a reduction in the time spent recreating requirements.

Jonathan B. Spira is CEO and Chief Analyst at Basex.
Cody Burke is a senior analyst at Basex.

Information Overload – It Isn’t Just Too Much E-mail

Thursday, August 20th, 2009 by Jonathan Spira

One might assume that pinpointing the sources of Information Overload is relatively black and white, i.e. it’s just too much e-mail. In reality, nothing could be farther from the truth.

The problem of Information Overload is multifaceted and impacts each and every organization whether top executives and managers are aware of it or not.  In addition to e-mail, Information Overload stems from the proliferation of content, growing use of social networking tools, unnecessary interruptions in the workplace, failed searches, new technologies that compete for the worker’s attention, and improved and ubiquitous connectivity (making workers available anytime regardless of their location).  Information Overload is harmful to employees in a variety of ways as it lowers comprehension and concentration levels and adversely impacts work-life balance.  Since almost no one is immune from the effects of this problem, when one looks at it from an organizational point-of-view, hundreds of thousands of hours are lost at a typical organization, representing as much as 25% of the work day.

So what else besides e-mail overload is at issue here?  Here’s a quick rundown.

- Content
We have created billions of pictures, documents, videos, podcasts, blog posts, and tweets, yet if these remain unmanaged it will be impossible for anyone to make sense out of any of this content because we have no mechanism to separate the important from the mundane.  Going forward, we face a monumental paradox.  On the one hand, we have to ensure that what is important is somehow preserved.  If we don’t preserve it, we are doing a disservice to generations to come; they won’t be able to learn from our mistakes as well as from the great breakthroughs and discoveries that have occurred.  On the other hand, we are creating so much information of uncertain importance that we routinely keep everything.  If we continue along this path, which we most certainly will, there is no question that we will require far superior filtering tools to manage that information.

- Social Networking
For better or worse, millions of people use a variety of social networking tools to inform their friends – and the world at large – about their activities, thoughts, and observations, ranging down to the mundane and the absurd.  Not only are people busily engaged in creating such content but each individual’s output may ultimately be received by dozens if not thousands of friends, acquaintances, or curious bystanders.  Just do the math.
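Doing the math with some assumed, and deliberately modest, numbers shows how quickly the volume compounds:

```python
# Rough fan-out estimate; every figure here is an assumption for illustration.
posters = 1_000_000     # people posting updates
posts_per_day = 5       # updates per person per day
followers = 150         # average audience per person

impressions = posters * posts_per_day * followers
print(f"{impressions:,} items landing in feeds per day")  # → 750,000,000 items landing in feeds per day
```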

- Interruptions
We’ve covered this topic many times (http://www.basexblog.com/?s=unnecessary+interruptions) but our prime target is unnecessary interruptions and the recovery time (the time it takes the worker to get back to where he was) each interruption causes, typically 10-20 times the duration of the interruption itself.  It only takes a few such interruptions for a knowledge worker to lose an hour of his day.
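Plugging the 10-20 times recovery figure into a quick calculation, with assumed values for the interruption length and count, shows how an hour slips away:

```python
# Recovery-time arithmetic: each interruption costs its own duration
# plus a recovery multiple of that duration. Inputs are assumptions.
interruption_min = 0.5        # a half-minute IM, say
recovery_factor = 15          # midpoint of the 10-20x range
interruptions_per_day = 8     # assumed count

lost_per_event = interruption_min * (1 + recovery_factor)  # 8 minutes each
lost_per_day = lost_per_event * interruptions_per_day
print(f"{lost_per_day:.0f} minutes lost per day")  # → 64 minutes lost per day
```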

- Searches
50% of all searches fail in ways the searcher recognizes.  What isn’t generally recognized, and what comes out of our research, is that 50% of the searches believed to have succeeded also failed, but the person doing the search didn’t realize it.  As a result, that person uses information that is perhaps out of date or incorrect or just not the right data.  This has a cascading effect that further propagates the incorrect information.

- New technologies
We crave shiny new technology toys, those devices that beep and flash for our attention, as well as shiny new software.  Each noise they emit takes us away from other work and propels us further down Distraction Road.  It’s a wonder we get any work done at all.  Even tools that have become part of the knowledge workers’ standard toolkit can be misused.  Examples here include e-mail (overuse of the reply-to-all function, gratuitous thank you notes, etc.) and instant messaging (sending an instant message to someone to see if he has received an e-mail).

Jonathan B. Spira is CEO and Chief Analyst at Basex.

Information Overload Awareness Day

Thursday, July 9th, 2009 by Jonathan Spira

“What can we do to call more attention to the problem of Information Overload?” is a question I hear almost daily from managers at companies who have recognized the extent to which the problem impacts their organizations.  As of now, I have a much better answer than I previously had: participate in Information Overload Awareness Day, a new workplace observance that calls attention to the problem of information overload and how it impacts both individuals and organizations.

Yes, you can wear a button or a T-shirt (we’ll have those next week) but that’s only the first step.  On August 12, the day we’ve set aside to focus our attention on the problem, we are holding an online event that will permit us to do a deep dive into different ways that Information Overload is adversely impacting knowledge work and knowledge workers while also spotlighting possible solutions to help managers and policymakers cope with loss of productivity.

Information Overload describes an excess of information that results in the loss of ability to make decisions, process information, and prioritize tasks.  Organizations of all shapes and sizes have already been significantly impacted by it; according to our research the problem costs the U.S. economy $900 billion per year in lowered productivity and throttled innovation.

The event features a variety of speakers including noted authors Maggie Jackson (“Distracted”) and Mike Song (“The Hamster Revolution”), executives from such companies as Dow Jones and Morgan Stanley, a CIO from the U.S. Air Force, and Nathan Zeldes, president of the Information Overload Research Group and the former executive in charge of addressing the problem at Intel.  (I’ll be there too, of course.)

While a few people put their heads in the sand and say this is not a real problem, the costs are quite real and the problem is only going to get worse.  By 2012, the typical knowledge worker will receive hundreds of messages each day via e-mail, IM, text, and social networks.

Simply put, companies need to focus on what can be done to lessen information overload’s impact right now.  We’ll look at the latest research and solutions and cover areas including managing e-mail, calculating Information Overload exposure, improving search, and managing content, just to name a few.

The cost of the event is $50; attendees who promise not to multi-task (i.e. IM, e-mail, or text) during the event will receive a 50% discount.

Companies are invited to sponsor Information Overload Awareness Day by enrolling as Designated Sites.  This allows all of their employees to attend at no charge and demonstrates their commitment to helping solve the problem.

Tweet this: Information Overload Awareness Day Aug. 12; event to present latest research and solutions; http://www.informationoverloadday.com/

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

