Archive for October, 2009

Seeking the Forest of Experts Through the Trees

Thursday, October 29th, 2009 by Jonathan Spira

Information Overload has significantly exacerbated the problem of finding an expert.

Where are the experts?

Do a search and you’ll get 564,768 results.  Of course, searches only address explicit information.  Most information is tacit knowledge that resides in the minds of experts.  When those experts leave the office in the evening, they take that knowledge with them; some may never return, taking their expertise to a new job or employer.

There are myriad ways for knowledge workers to locate an expert, but two stand out in my mind:  1.) asking a few close colleagues and getting a limited number of answers, and 2.) sending an “all-hands” e-mail to hundreds or thousands of colleagues stating one’s query.  In thousands of companies across the land, e-mail messages asking “does anyone know anything about…” or “does anyone know anyone who knows the VP of marketing at…” are commonplace.

Of course, the all-hands method is very disruptive and adds to the problem of Information Overload.  An e-mail query that should have gone to only a handful of colleagues but instead went to 500 costs the company 1.7 days (ca. 40 hours) in lost man-hours when you calculate the impact of the interruption on the 488 people who didn’t need to receive it.  That e-mail dance probably happens multiple times a week, even though the results of this search technique are modest at best.
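
For illustration, here is a rough sketch of that interruption arithmetic in Python; the figure of roughly five minutes lost per unnecessary recipient (reading time plus recovery time) is an assumption chosen to reproduce the ballpark above, not a measured value.

    # Rough sketch of the interruption cost of an all-hands e-mail.
    # Assumption: each unnecessary recipient loses about five minutes
    # (time to read the message plus time to recover focus afterwards).
    recipients = 500
    needed_recipients = 12            # the handful who actually had to see it
    minutes_lost_each = 5             # assumed reading + recovery time

    unnecessary = recipients - needed_recipients       # ~488 people
    hours_lost = unnecessary * minutes_lost_each / 60
    print(f"{hours_lost:.1f} hours lost")               # ~40.7 hours
    print(f"{hours_lost / 24:.1f} days lost")            # ~1.7 (24-hour) days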

Expertise location tools, which have been around for well over a decade, have been unable to keep up with what people know (no surprise there) and who the experts are.  A Basex research report from 2002 profiled expertise location and management platforms from eight companies.  Five of them have disappeared, one was purchased by Oracle, and the other two (Lotus and Sopheon) were not focused solely on solving the problem.

Social software may give us more breadcrumbs in answering the “who knows what” question, but much work is still needed to create a platform that integrates expertise location into commonly-used enterprise tools (both to locate as well as to rate experts).

Before that, however, one should focus efforts on lowering the overall amount of Information Overload within an organization, as doing that will make it much easier to see the forest of experts through the trees.

A wealth of Information Overload resources, including a three-minute movie on the topic featuring senior executives discussing how Information Overload impacts them, may be found on our Information Overload microsite.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

In the briefing room: DotNetNuke

Thursday, October 29th, 2009 by Cody Burke

Finding a content management system that fits your needs is far from simple.

DotNetNuke's Marketplace

Indeed, as content creation skyrockets and organizations increasingly need to offer robust Web sites and portals for both internal and external use, the options become dizzying.  The ability to customize and develop channels such as Web sites, intranets, and community portals is increasingly attractive and necessary in a competitive market, no matter what business a company is in.

An offering that provides those kinds of customization options is DotNetNuke, a versatile open source development platform.  The DotNetNuke project, and eventually the company, evolved out of a modified version of Microsoft’s IBuySpy Portal that was released in early 2002 under a liberal end-user license agreement allowing modification.  By late 2002, Shaun Walker, who would go on to found DotNetNuke, released his own modified version that added features and sparked an active and vibrant open source developer community.  The project was renamed DotNetNuke in February 2003 and DotNetNuke Corp. was incorporated in September 2006.

DotNetNuke is an open source content management and application development framework for the Microsoft .Net software framework.  Like other commercial open source vendors, DotNetNuke has grown up around a specific product, in this case the .Net software framework.  The company offers a free Community Edition, and sells Premium and Elite Editions that include expanded feature sets and support options.  At its core, the platform is designed to enable users to build Web sites that are customizable through use of open source modules and skins (basic reusable HTML files for graphical presentation that have placeholders for content) that the company provides via its online marketplace.

The platform includes modules for login, announcements, blogs, chat, events, FAQs, feedback, forms and lists, forums, help, newsfeeds, reports, search, site logs, surveys, users and roles, and wikis.  From there, users can customize the system by using modules and skins that an active community of developers and partners maintain.  A visit to www.snowcovered.com (which was recently acquired by DotNetNuke and replaces the company’s own marketplace) reveals a thriving ecosystem of third-party modules and skins, offering everything from event calendar and registration, video gallery, and document library modules to an expansive selection of skins for tweaking the look of a Web site.

When considering commercial open source solutions, the number of active developers and community members is reflective of the health of the project. What is attractive about DotNetNuke is the large and thriving ecosystem that, when paired with the modular approach the company takes with the platform, gives organizations the ability to set up sites and have a wide range of options for customizing them for their specific needs. This makes DotNetNuke a platform that will end up on more and more organizations’ short lists.

Cody Burke is a senior analyst at Basex.

Fall Back: Europe Moves to Winter Time, U.S. Changes Clocks Next Weekend

Saturday, October 24th, 2009 by Jonathan Spira

Europe and the U.S. fall back - at different times

Europe sets its clocks back one hour to “winter time” at 1 a.m. GMT Sunday.  Next Sunday morning, while most people are still asleep, the United States and parts of Canada will switch back to Standard Time at 2 a.m. local time.

The November change is in accordance with the Energy Policy Act of 2005 and once again is one week later than in years before the 2005 Act.  If you don’t think that these changes are a big deal, change the time on your laptop by an hour and see what happens.  The impact of this seemingly minor change extends well beyond computers, to legions of business travelers and mobile knowledge workers, among others.

For the coming week, the United States will be out of sync with a good part of the rest of the world (most of Asia, Africa, and South America do not observe Daylight Saving Time at all).

Last March, as if to illustrate this specific point, I discovered that a recurring bi-weekly meeting scheduled by a colleague based in Israel had mysteriously moved to noon EDT on my calendar for its two occurrences in March.  For the April occurrences, it returned to the original time, 11 a.m. EDT.
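
The drift is easy to reproduce: the meeting is fixed in Israeli local time, but Israel and the U.S. begin Daylight Saving Time on different dates, so the U.S.-calendar time shifts during the gap.  Here is a minimal sketch using Python’s zoneinfo module; the 6 p.m. Asia/Jerusalem start time is an assumption for illustration.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    jerusalem = ZoneInfo("Asia/Jerusalem")
    new_york = ZoneInfo("America/New_York")

    # Mid-March 2009: the U.S. is already on Daylight Saving Time, Israel is not yet.
    march_meeting = datetime(2009, 3, 18, 18, 0, tzinfo=jerusalem)
    # April 2009: both countries are on Daylight Saving Time, restoring the usual offset.
    april_meeting = datetime(2009, 4, 1, 18, 0, tzinfo=jerusalem)

    print(march_meeting.astimezone(new_york))   # noon EDT
    print(april_meeting.astimezone(new_york))   # 11 a.m. EDT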

Daylight Saving Time is a system of managing the changing amounts of daylight that occur during the year, with a goal of maximizing daylight hours during typical waking hours.  It was first proposed in 1784 by Benjamin Franklin, who believed it would save an “immense sum.”  It was not broadly adopted until the early twentieth century, when the U.S. temporarily enacted Daylight Saving Time as an energy-saving measure.

By adjusting clocks ahead by an hour in the spring, people can have more daylight available during the workday.  For example, someone who typically awakens at 7 a.m. would otherwise have to wake up at 6 a.m. to take advantage of the sun rising earlier each day; by moving the clock ahead one hour, that person can continue to wake up at 7 a.m. and still enjoy more daylight in the evening hours.

Prior to 2005, the last change to the Daylight Saving Time schedule was in 1986, when legislation changing the onset of Daylight Saving Time from the last Sunday in April to the first Sunday in April was enacted.

But recent studies indicate that the energy savings may be illusory.  One study demonstrated that the statewide switch to Daylight Saving Time in Indiana in April 2006 cost households an additional $8.6 million in electricity.  Another study suggested that the temporary extension of Daylight Saving Time in two Australian territories for the 2000 Summer Olympics increased energy usage.

On the other hand, the American Council for an Energy-Efficient Economy, a nonprofit group, estimated that the cumulative benefit of the change through the year 2020 will be a savings of ca. $4.4 billion and 10.8 million metric tons less carbon sent into the environment.  According to the U.S. Department of Transportation, for every day we are on Daylight Saving Time, we trim one percent of the country’s electrical consumption.

Most devices, including laptops and desktop computers (not to mention servers), should have been updated by now, but it still pays to double check.  Affected systems range from automated wake-up systems in hotels to systems that schedule airline crew members and slot aircraft for gates.  In addition, many computer-to-computer systems may also be impacted.

Remember that Daylight Saving Time is not observed in Hawaii, American Samoa, Guam, Puerto Rico, the Virgin Islands, and Arizona (with the exception of the Navajo Nation).  Until 2006, the counties in the Eastern Time Zone of Indiana did not observe Daylight Saving Time and remained on standard time year round.  As of April 2006, all of Indiana observes Daylight Saving Time.

Oh, and get a good night’s sleep.

Jonathan B. Spira is the Chief Analyst at Basex.

In the briefing room: Confidela WatchDox

Thursday, October 22nd, 2009 by Cody Burke

Creating content is easy; however, managing the distribution of that content in a secure and traceable way is problematic to say the least.

Confidela WatchDox

Simply e-mailing documents is not the answer: once the document leaves the outbox, all control and visibility are lost.  Additionally, solutions that do exist for sending documents securely often insert friction into the knowledge worker’s routine; some require an additional application to encrypt or send a document, ultimately lessening the likelihood of the solution being used.  For a tool to be used and widely adopted, it must be seamlessly incorporated into the knowledge worker’s existing toolset.

One company that is offering a solution to the problem of secure document sharing is Confidela, with its Software-as-a-Service WatchDox offering.  The product is a tool for sending documents securely from within Outlook via a plug-in or, alternatively, from a Web interface.  The Outlook plug-in adds a button to the UI when composing a new message.  The plug-in can be set as a manual option, as an automatic suggestion whenever attachments are sent, or as fully automatic whenever a document is sent outside the company.  Attachments are replaced with WatchDox links that the recipient clicks on to securely view the document.  When sending, the system prompts the sender to define policies, give permissions, and determine recipients.  The attachment is then pulled into a separate outbox, converted to a WatchDox link, and sent.

To receive a document, a recipient goes through a one-time authentication of their computer, similar to the approach many banks use, with the computer’s footprint saved.  Users access the document via a link delivered in an e-mail, and the document appears blurred out when the focus is not on it.  According to the company, this feature should prevent screenshots and the like from being taken.  For the sender, a My Docs view provides usage information for documents that have been shared and sent out, what actions recipients have taken, any action required, and metadata surrounding the documents.

WatchDox is hosted on Amazon’s EC2 cloud Web service, and all documents are encrypted with a unique key.  For further security, Confidela keeps access controls separate from storage, and the company does not have access to those controls.

WatchDox impressed us with its ease of use and the fact that it works within existing tools without introducing additional friction between the knowledge worker and software.  Particularly as an Outlook plug-in, the ability to set WatchDox as either optional or automatic grants users control while at the same time increasing the likelihood of use by placing it in the primary domain of the knowledge worker, the inbox.

Cody Burke is a senior analyst at Basex.

Everybody Loves a Cloud

Wednesday, October 21st, 2009 by Jonathan Spira

Is cloud computing just the latest fad?

The IT industry loves a new buzzword, even if it’s just a new word and not a new concept. Cloud computing is the buzzword du jour and software vendors small and large want everyone to enjoy life in the cloud.

But is cloud computing really that new? Or is it a move back to centralized systems, reminiscent of the days when the mainframe was king with a dash of Software-as-a-Service added?

Some cloud resources are available as a utility (metered service, conceptually similar to a public utility such as the electric company) or as a service (billed monthly or annually regardless of usage). The concept of utility computing dates back to the 1960s and was in use in the 1990s for asynchronous transfer mode (ATM) networks. As for the cloud itself, as Amy Wohl recently reminded me, a quick look at network slides from the 1980s shows clouds sitting above every network. I recall the term “telecom cloud” being used to describe VPN networks deployed by telephone companies because the routing for data traffic was, for all intents and purposes, somewhat cloudy.  And Larry Ellison of Oracle asked, at a recent Churchill Club presentation, whether it was true that companies such as Google were based on “water vapor.”

Needless to say, there are some clear and unequivocal advantages to cloud computing. A company can deploy IT resources on demand with little or no up-front cost. Therein lies the rub. While the barrier to entry (for end-users) is minimal, the usage or monthly fees continue ad infinitum, unlike licensed software for which there is only a one-time cost. Granted, larger organizations pay maintenance and support charges for their enterprise software, so this may be a greater expense to smaller organizations that have not paid such fees in the past. Companies that migrate a good part of their IT infrastructure to the cloud can eliminate significant costs for hardware and upkeep, not to mention IT personnel.
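
A back-of-the-envelope comparison makes the licensing-versus-subscription trade-off concrete; all of the dollar figures below are assumptions chosen purely for illustration.

    # Hypothetical comparison of a one-time license (plus annual maintenance)
    # with an ongoing cloud subscription; the figures are illustrative assumptions.
    license_cost = 100_000        # one-time perpetual license
    annual_maintenance = 20_000   # yearly maintenance and support
    monthly_cloud_fee = 4_000     # recurring subscription fee

    for year in range(1, 6):
        licensed_total = license_cost + annual_maintenance * year
        cloud_total = monthly_cloud_fee * 12 * year
        print(f"Year {year}: licensed ${licensed_total:,} vs. cloud ${cloud_total:,}")
    # The cloud's up-front barrier is far lower, but its recurring fees
    # eventually overtake the licensed total (here, around year four).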

Cloud computing has also enabled new business models. Indeed, the rise of mobile computing and smaller smart devices (such as netbooks and smartphones) – devices with little storage and relatively few installed applications – may serve to drive the next generation of cloud computing applications. Cloud computing has also allowed software companies to develop applications tailored to individual consumers (as opposed to large companies) and this is yet another area where we are just scratching the surface in terms of innovation. It’s important to keep in mind that some knowledge workers within larger organizations may use applications from the cloud intended more for consumers and may ask for enterprise services from the cloud based on their experiences; others may sneak in applications (under the radar of their IT departments) that they need to do their work, making the distinction between enterprise and consumer cloud apps a bit blurry.

Despite the hype, cloud computing comes with some baggage. There is no agreement on standards or a single architecture. In fact, the one thing on which there is consensus is that there is no true consensus as to what cloud computing really is.

This doesn’t mean that managers shouldn’t investigate whether cloud computing might work for their organizations. But before they do, there are three factors to consider.

First and foremost is vendor lock-in, a concept familiar to many enterprise software buyers. Competing cloud providers have their own standards and formats, many of which are incompatible with one another. Indeed, even the simple (in concept) task of moving data from one cloud to another is fraught with peril.

Second is storage. While large vendors such as Salesforce.com have built up considerable trust with the user community, there are new entrants to the cloud arena every day that haven’t earned their stripes. Is your organization’s data secure? Are proper security precautions in place, both in terms of online access as well as physical egress? For free services, what do the vendor’s terms of service allow it to do with personal data?

Last, and possibly most important: is the data stored in the cloud safe from loss? Cloud providers have gone down from time to time (Google’s Gmail service has had seven outages so far this year), but the point was driven home quite strongly last week by the T-Mobile Sidekick debacle, in which tens of thousands of users lost personal data (including address books and photo albums, among other assets) that had been entrusted to the aptly-named Danger subsidiary of Microsoft.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

E-mail: Reports of My Demise are Premature

Thursday, October 15th, 2009 by Jonathan Spira

It is both premature and foolhardy to proclaim that e-mail’s reign as “king of communications” is over, as a recent Wall Street Journal article trumpets.

E-mail remains the most-used corporate communications tool despite reports to the contrary.

Not that e-mail is the best communications medium for everything; indeed we know very well it isn’t.

Instead, e-mail has, in the past 15 years in particular, become the path of least resistance for almost everything that transpires within an organization.

Update status? Send an e-mail to a few hundred of one’s closest colleagues.

Finish a report? Send another e-mail to a few hundred of one’s closest colleagues.

The fact is that we use e-mail opportunistically rather than with an understanding of what the impact of its use might be.

Sending that status report to those few hundred colleagues actually cost the organization ca. 24 hours in lost time when one calculates the few minutes each person spent opening the e-mail he didn’t need to receive in the first place – plus the “recovery time,” which is the time it takes to get back to where one was in the task that was interrupted.
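
The same arithmetic can be restated with the recovery time broken out separately; the recipient count and the per-person minutes below are assumptions chosen to land near the ca. 24-hour figure above.

    # Assumed figures: a few hundred recipients, two minutes to open and skim
    # the message, three minutes of recovery time to return to the interrupted task.
    recipients = 300
    reading_minutes = 2
    recovery_minutes = 3

    hours_lost = recipients * (reading_minutes + recovery_minutes) / 60
    print(f"{hours_lost:.0f} hours of lost time")   # ~25 hours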

The result of all of our communications (and it isn’t just e-mail) is Information Overload, a problem that costs the U.S. economy ca. $900 billion per annum.  On August 12,  Information Overload Awareness Day was observed around the world with meetings and discussions.  But that’s just one day – each additional day that we don’t address the problem of Information Overload and take steps to lessen its impact costs billions.

Companies can take steps to lower their exposure to Information Overload (an article about what can be done may be found here), but even raising awareness of the problem and understanding the impact of overusing such tools as e-mail can make a big difference.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

In the briefing room: Mindjet Catalyst

Thursday, October 15th, 2009 by Cody Burke

Collaboration should be a given in practically every task a knowledge worker undertakes.

Mindjet Catalyst

Frequently, however, it isn’t and, in many cases, where collaboration does take place, it is not used to its best advantage.  Part of the reason for this is that much collaborative work takes place without a full picture of the project at hand.

Indeed, there exist different dimensions to collaboration and there is a significant need to connect knowledge workers, the collaborative process itself, and the organization with relevant complex information, ideas, and processes.  Given the trend towards both a dispersed workforce and the need for collaboration among multiple entities, effectively managing a project requires new approaches to joining people with information.

One approach that will make collaboration between knowledge workers more effective is to ensure that the supporting information is captured in a form that adds context and is easily shareable.  To add context, information must be linked to people, documents, and other supporting content.  One method of doing this is to create a mind map.  Mind mapping is a technique for brainstorming and organizing data that starts with a central point and then adds branches with related content such as links, documents, attachments, notes, and tasks.  The resulting diagram is a visual guide to a set of information that allows knowledge workers to see the big picture and understand the context of what they are doing.
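
As a rough sketch of what such a structure looks like in code (a generic illustration, not any particular vendor’s data model), a mind map is essentially a tree whose branches carry links, notes, and tasks.

    from dataclasses import dataclass, field

    @dataclass
    class Branch:
        topic: str
        links: list = field(default_factory=list)
        notes: list = field(default_factory=list)
        tasks: list = field(default_factory=list)
        children: list = field(default_factory=list)

    # A central point with two branches of supporting content
    launch = Branch("Product launch")
    launch.children.append(Branch("Marketing",
                                  links=["http://example.com/plan"],
                                  tasks=["Draft press release"]))
    launch.children.append(Branch("Engineering",
                                  notes=["Beta feedback due Friday"]))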

One company active in this space is Mindjet.  The company has made its reputation through the development of mind mapping products, such as its MindManager product line.  Adding further value to the company’s mind mapping capabilities, Mindjet recently launched Mindjet Catalyst, an online visual collaboration platform.  The offering has clear roots in the visual approach to mind mapping that the company is known for, and adds a team-based collaborative element.

Catalyst is an online service that can be accessed from anywhere via a Web browser and hooks into standards-based document repositories such as SharePoint.  Multiple users can make edits and attach supporting documents and other content to a mind map and have the changes reflected in real time.  The offering also includes pre-built map templates for common business situations such as online meetings or idea generation sessions.  Once maps are generated, they are shareable with colleagues (both users of Catalyst and those who do not use the product) via links that are e-mailed or posted on social networking sites.  Workspaces carry permission levels that grant reader, author, and owner access.  In addition, the environment is persistent, meaning that users are able to see changes that have occurred.

Catalyst also features integrated online chat functionality, and (optional) Web conferencing capabilities.  The integrated online chat embeds community into the work environment and allows for communication between colleagues without forcing them to leave the environment and switch tools.  The Web conferencing module includes desktop sharing, video and VoIP support, file transfer, and session recording.

Mindjet has taken a good and underappreciated idea, the visual mapping of information, and successfully integrated collaborative capabilities and tools into it.  Displaying information in a visual and connected way gives the knowledge worker context that is critically important for making informed decisions, capturing new information, and understanding business processes.  The addition of powerful collaborative elements extends the value of mind mapping by allowing knowledge workers to use the environment for the kind of collaborative team-based work that is a reality in the knowledge economy.

Cody Burke is a senior analyst at Basex.

In the briefing room: Teleplace 3.0

Thursday, October 8th, 2009 by Cody Burke

The Teleplace 3.0 environment.

Meetings, particularly the online variety, can be dull, tedious, and, most importantly, not terribly productive for participants.  This may very well have something to do with the medium and the manner in which the meeting is conducted.  In a typical online meeting, the main speaker may share his screen with the attendees, roll through a slide deck, perhaps demonstrate an application, and solicit feedback (in online meetings this occurs via the fairly rudimentary tools found in most meeting environments).

The limitations of this kind of approach to meetings are significant: a single two-dimensional interface common to all participants and a lack of connection between participants due, in part, to a lack of visual cues.  In addition, online meeting rooms typically differ from their real-life counterparts in that materials and files are not stored in them.  Many meetings are ongoing; participants meet several times a week or month and need to update materials in between, as well as to be able to return to a virtual room and have the needed materials in one place, in the state in which they left them.

As anyone who has read Snow Crash knows, the concept of using virtual environments for business use is not new.  Organizations as varied as IBM and the U.S. Army have explored the possibility of using virtual worlds for training, meetings, and collaboration.  During the recent Second Life land grab, enthusiasm for which has since died down, that virtual world was flooded by companies establishing virtual properties for marketing and customer outreach.  Ultimately however, the perception of virtual worlds and environments as a toy, not a tool, has proven difficult to shake.

One company that is pushing the business case for virtual environments is Teleplace, née Qwaq.  The recent name change was part of a shift by the company to make clear its focus on enterprise customers.

Teleplace 3.0 is the latest version of the company’s online environment for meetings, training sessions, visualization, and virtual operations centers.  Teleplace has been designed from the ground up as a business environment first, and a 3-D virtual world second.  Spend as little as an hour in Teleplace (I’ve spent several already), and you will see it is suited for serious business.  In Teleplace, business applications exist in a persistent state on virtual walls and displays.

Teleplace can accommodate meetings of different sizes: small meetings with a handful of people allow for complete interaction amongst participants, while virtual lecture halls can handle up to 60 people and, if more attendees are expected, can support a broadcast mode that reaches thousands.  Participants can use a laser pointer to direct everyone’s attention to objects or specific areas of a chart.  Meeting leaders can bring people into rooms or areas, and also conduct polling and control communications.

There are many features in Teleplace that effectively demonstrate that virtual environments can be an effective business tool.  Teleplace goes beyond the traditional meeting environment and provides tools that have the potential to introduce greater efficiencies into the workplace.  One example is the persistence of the environment, a huge step up from traditional online meetings: an attendee can view a shared chart or slide show on a display wall, move to another area to interact with other attendees, and then simply return to the wall to view the chart again.  Environments that have been populated with content, such as video clips, slide decks, documents, and integrated business applications, remain in place, enabling users to drop in and out and later return to the same work area.

Virtual work environments may in some ways remind us of their toy predecessors, but offerings such as Teleplace 3.0 demonstrate that they are in fact powerful business tools.

Cody Burke is a senior analyst at Basex.

U.S. v. IBM Round 3: DOJ Starts Antitrust Inquiry

Wednesday, October 7th, 2009 by Jonathan Spira

The Department of Justice is looking into potential antitrust violations by IBM

For the third time in 60 years, the United States government has started an inquiry into possible monopolistic practices by IBM in the mainframe computer market.  Antitrust regulators from the Department of Justice have been contacting companies (including members of the Computer and Communications Industry Association, which filed a complaint about IBM with the Department) about IBM’s business practices in the space.  The association, which is supported by IBM competitors including Google, Microsoft, and Oracle, claims that IBM has stymied competition in the mainframe market and blocked attempts by others to license IBM’s software.

The mainframe business comprises a significant part of IBM’s revenue; including storage systems and professional services, it adds up to at least 25%.  IBM has seen its rivals withdraw from the market as the company continued to innovate with more advanced systems.  Last week, a civil suit by IBM competitor T3, which resold computers that behaved like mainframes, was dismissed.  The court’s ruling stated that the fact that IBM had invested heavily in advancing mainframe technology without licensing it to others “does not constitute anticompetitive behavior.”

In 1952, the U.S. Government alleged that IBM had violated Sections 1 and 2 of the Sherman Antitrust Act, in part because IBM only leased, and would not sell, tabulating machines.  By 1955, IBM had adopted similar practices for mainframe computers.  The matter was settled in 1956; the Final Judgment required IBM to sell as well as lease computers.  In 1969, the DOJ filed a complaint that IBM was in violation of the Sherman Act by attempting to monopolize the business computer market.  The trial lasted over six years, but the complaint was withdrawn in 1982 by the DOJ, which stated that the charges were “without merit.”

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

Nortel Liquidation Continues: Company to Sell Optical Unit to Ciena

Wednesday, October 7th, 2009 by David Goldes

Nortel is in the last phases of liquidation.

As part of the continuing saga of the Nortel liquidation, the company announced agreements covering the global sale of its Optical Networking and Carrier Ethernet businesses to Ciena, a network infrastructure company that has competed fiercely with Nortel in the past.  “We believe we are best positioned to leverage these assets, thereby creating a significant challenger to traditional network vendors,” said Gary Smith, Ciena’s CEO.

Under the supervision of bankruptcy courts in the U.S. and Canada, Nortel and its principal subsidiaries, which filed for bankruptcy in January, have entered into a “stalking horse” asset sale agreement with Ciena for its North American, Caribbean and Latin American (CALA), and Asian Optical Networking and Carrier Ethernet businesses, as well as an asset sale agreement with Ciena for the Europe, Middle East and Africa (EMEA) portion of those businesses.  The purchase price is $390 million in cash and 10 million shares of Ciena common stock.

The agreements cover Nortel’s OME 6500, OM 5000, and CPL platforms, its 40G/100G technology, the related services business, and all patents and intellectual property that are predominantly used in these businesses.  The agreements also provide for the transfer of almost all of Nortel’s customer contracts to Ciena.

The company announced that at least 2,000 employees, more than 85 percent of the workforce of the units being sold, would be offered employment with Ciena.

As in any stalking horse sale, these agreements are far from final.  Nortel’s stalking horse agreement with Nokia-Siemens for its wireless unit ended with Ericsson as the acquirer back in July. The company’s enterprise unit was sold to Avaya, also in July.

On September 30, Nortel announced it will accept bids for its global GSM/GSM-R business.  GSM (Global System for Mobile communications) is the most popular wireless technology standard for mobile phones in the world.  GSM-R is a technology that provides a secure communications system for railway operators.

David M. Goldes is president and senior analyst at Basex.

