Archive for July, 2009

Information Overload in Government: Costly and Dangerous

Wednesday, July 1st, 2009 by Cody Burke

Information overload is typically thought of as a problem for large corporations, and the needs of the public sector have seemingly been ignored in such discussions (including those by Basex).  Yet for a variety of reasons, government entities are in many respects far less prepared to deal with the problem than their corporate counterparts.

In a typical corporate environment the primary focus is naturally on turning a profit.  As a result, when presented with hard numbers that show the impact information overload is having on the bottom line (information overload costs the U.S. economy ca. $900 billion per year), smart managers have little choice but to consider how to reduce their exposure to the problem and recapture lost revenue.

Public sector organizations, however, operate with a level of bureaucracy that leaves even the largest corporation looking like a mom-and-pop shop.  Due to the complicated and somewhat politicized nature of budgets and contracting, as well as the not-for-profit operating model of government programs that may prioritize job creation and services rendered over efficiency, it is far more difficult for government entities to move with alacrity to address these issues.

Any discussion of information overload in a government context should incorporate two key points: finance and national security.

As with any organization, be it public or private, the overabundance of information that confronts knowledge workers directly impacts their ability to do their jobs in an efficient and effective manner.  In an era of extreme belt-tightening and budget cuts at all levels of government, there is no better time to address information overload in this context.

Basex estimates that local, state, and federal governments spend at least $31 billion managing information each year.  Indeed, it is almost impossible to work for the government and not spend significant time managing information.  In a recent survey of government and education workers conducted by Xerox as part of a study on public sector e-mail use, 58% of those surveyed reported spending nearly half of their average workday filing, deleting, or sorting paper or digital information.

A second and decidedly more sinister problem is information overload’s impact on national security.  For instance, misuse of e-mail can be deadly to government agencies.  In early January 2009, a cascading series of reply-to-all e-mail messages in the State Department snowballed and nearly shut down the e-mail system in what amounted to a self-administered denial-of-service attack.  The matter was not taken lightly: a warning was sent to all State Department employees promising unspecified “disciplinary actions” for using the reply-to-all function on e-mail with large distribution lists.  My colleague Jonathan Spira is preparing a case study on a similar occurrence that took place at Maxwell Air Force Base, ironically during a conference on cyber-security.

These incidents demonstrate how vulnerable government agencies, including those tasked with protecting our country, can be, and how threats can arise from the simple misuse of a common communication tool.
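Why does a reply-to-all cascade behave like a denial-of-service attack?  Every reply is delivered to every address on the distribution list, so even a tiny reply rate multiplies traffic geometrically.  The short Python sketch below illustrates the dynamic; the list size and reply rate are hypothetical numbers chosen for illustration, not figures from the State Department incident.

    # Minimal sketch of a reply-to-all storm; all numbers are hypothetical.
    LIST_SIZE = 3000      # addresses on the distribution list (assumed)
    REPLY_RATE = 0.002    # fraction of recipients who reply to all (assumed)
    ROUNDS = 5

    messages = 1          # the original message sent to the list
    total_deliveries = 0
    for n in range(1, ROUNDS + 1):
        deliveries = messages * LIST_SIZE        # each message reaches every member
        total_deliveries += deliveries
        messages = int(deliveries * REPLY_RATE)  # replies seed the next round
        print(f"round {n}: {deliveries:,} deliveries, {messages:,} new replies")
        if messages == 0:
            break
    print(f"total: {total_deliveries:,} deliveries")

Under these assumptions, each round multiplies traffic sixfold (3,000 recipients × 0.002 = 6 new reply-to-all messages per message delivered), producing several million deliveries within five rounds, easily enough to saturate a mail system.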

Trouble exists on structural and cultural levels as well.  Intelligence and law enforcement agencies depend on knowledge sharing to spot threats and create actionable intelligence, yet the ability of the various agencies to share that knowledge is hamstrung by outdated and somewhat nonsensical classification systems, incompatible tools, and a culture that promotes extreme siloing of information.  This stovepipe mentality, where information moves up or down in a hierarchical manner, not horizontally to where it is needed, fails to match the innovative, network-based threats that we face in the form of non-state actors, terrorism, and organized crime.

Additionally, the sheer volume of content has increased significantly as the intelligence community rightly shifts away from a culture of secrecy, where classified information is deemed to have more value simply by virtue of its classification, to a more open model that leverages Open Source Intelligence (OSINT) assets (anything legally available, including newspaper articles, social content, and conference proceedings).  In 2005, following the recommendations of the 9/11 Commission and the WMD Commission, the Office of the Director of National Intelligence (ODNI) created the Open Source Center (OSC) to collect, analyze, and disseminate open source intelligence, as well as to train analysts to take full advantage of these resources.

There have been a few bright spots in government information sharing, such as Intellipedia, a Wikipedia-style information resource developed for the U.S. intelligence community in 2005, and the Center for Army Lessons Learned, a resource for military personnel to share best practices and local knowledge.  Sadly, the overall effort of the military and intelligence community to address information issues has not yet caught up with the efforts being made in the private sector.

Just as government is not immune to information overload, neither is it incapable of taking steps that will reduce the extent of the problem and at the same time improve information sharing and collaboration.  We’re just scratching the surface here (we’ll have more research on this topic in the coming months), but we would love to get feedback and comments from readers in these sectors.

Cody Burke is a senior analyst at Basex.

Kodachrome Requiem

Wednesday, July 1st, 2009 by Jonathan Spira

Last week’s announcement that Eastman Kodak would “retire” Kodachrome film left many photographers feeling nostalgic, although few apparently were still purchasing the product.  Its passing deserves far more than a quick refrain of Paul Simon’s “Kodachrome,” although younger generations may be left to wonder if another technology is taking their JPEGs, TIFFs, and GIFs away.

We take more photographs than ever before, thanks in part to the fact that photography, sans film and processing costs, has become almost free.  But this comes at a price: while the earliest photographs (the word photograph literally means “drawing with light”), such as daguerreotypes and ambrotypes, are still visible to the naked eye today, many photographs taken in the 25 years since electronic photography moved out of the laboratory are no longer accessible.

I first realized this in 1999 and 2000 when I was researching information for the “Filmless Photography” chapter of the book I co-authored, The History of Photography.  Some of NASA’s earliest pictures of Earth have long since become inaccessible.  Digital file standards come and go (try opening a WordStar or early WordPerfect file on your laptop today).  So little had been written about electronic photography’s early days that my book turned out to be the first history of photography to document that facet.  (The era of still-video cameras, the first generation of filmless cameras, which launched in the 1980s, seems to be all but forgotten.  We have the first two still-video cameras, the Canon RC-701 (ca. 1984) and the Canon RC-760 (ca. 1987), in The Spira Collection, as well as many other cameras that followed in that category, but I doubt I could easily retrieve any images still stored within them.)

The first consumer-grade color digital camera, the Apple QuickTake (who here didn’t know Apple made cameras?), came with QuickTake software on a 3.5″ diskette; the diskette will certainly be an historical curiosity by the year 2011.  The first professional digital camera, the Kodak DCS (ca. 1991), was built on a Nikon F3 body and was the first digital camera to take images of any reasonable quality (the first digital camera to be sold, the Dycam Model 1, produced images that were suitable only for newspaper-quality halftones up to 5×4″); it came with a digital storage unit (DSU) containing a 200 MB hard drive that could hold 160 images.  The Spira Collection has the Kodak and Apple cameras as well – I suppose I should, in the interest of science, plug them in and see what I find.

I’ll leave you with the final two sentences from The History of Photography:  “There will always be some form of recording light images; it is a science that has taken centuries to evolve. What shape it will take in the future has yet to be determined, but each technological advance in photography has served to broaden and deepen its reach.”

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

