Archive for April, 2009

In the Briefing Room: Kosmix

Thursday, April 30th, 2009 by Cody Burke

Knowledge workers have traditionally had a love/hate relationship with search technologies.  The vast amounts of information that we must sift through to find data that is relevant to us in a given situation make search tools a necessity.  We love being able to quickly find the current price of a product and the location of stores selling it, or the most recent article on a key competitor.  The flip side is that our search tools actually fail us most of the time: 50 percent of search queries fail outright (these we are aware of), and of the 50 percent that we believe succeed, a further 50 percent fail us in some way that we may not even realize.
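The arithmetic behind those figures is worth making explicit.  A quick back-of-envelope calculation (using the 50/50 estimates cited above) shows how low the true success rate really is:

```python
# Back-of-envelope: effective search success rate from the figures above.
outright_failure = 0.50  # queries that fail outright
hidden_failure = 0.50    # share of "successful" queries that still fail in some way

apparent_success = 1 - outright_failure                  # what users perceive
true_success = apparent_success * (1 - hidden_failure)   # what they actually get

print(f"Apparent success rate: {apparent_success:.0%}")  # 50%
print(f"True success rate:     {true_success:.0%}")      # 25%
```

In other words, by these estimates only one query in four fully succeeds.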

Although we have largely resigned ourselves to a world of Google searches that return results instead of answers, there is no shortage of those who are laboring to reimagine search and address some of its fundamental flaws.

One such company, Kosmix, is taking a slightly unorthodox approach: they are not even attempting to fix search.  Instead, Kosmix is targeting the way in which we browse topics, and leaving the navigation aspect of search (finding a specific Web site) to Google and its ilk.  By separating discovery and research from traditional search, Kosmix is attempting to divide and conquer the search problem by zeroing in on a key weakness of results-based searching, namely the presentation of the contextual information that surrounds a topic.

Kosmix’s core product is its eponymous Web site, currently in beta, which allows users to browse content by topic.  The content is pulled from around the Web and presented in modules; a search for netbooks, for example, yields a definition from Wikipedia, images from Google and Flickr, related question-and-answer threads from Yahoo Answers, reviews and guides from eHow, video content from Truveo and Blinkx, Google blog search results, content from tech-related Web sites, relevant Facebook groups, shopping options from eBay and Amazon, and a summary of related items, such as specific brands of netbooks and related topics that can be drilled down into.

For comparison, a Google search for netbooks resulted in 35,700,000 results, with the only organization of the links being small subsets for news and shopping.

The content that Kosmix presents may not please everyone; automated editorial choices are made as to where to pull content from on a query-by-query basis, based on what is available, the value of a site, and the relevance of articles.  For example, the system takes a query and then determines which video site’s content is best suited, based on relevance and ratings on that site.  Kosmix acknowledges that the aggregated content is not always a perfect fit but is working to improve the system in order to deliver better results as it moves forward with the product.

Kosmix is a useful tool for research and discovery around a specific topic, and does a good job of presenting content in a manageable manner, from a broad variety of sources.  Leaving navigation to the established search companies is a wise move for Kosmix, as is demonstrating that there is a better way to find content online than Google searches that return results lacking in context, and more often than not, lacking the information we were looking for in the first place.

Cody Burke is a senior analyst at Basex.

The Last Frontier: In-flight Internet Access, Take 2

Wednesday, April 29th, 2009 by Jonathan Spira


American Airlines was the first U.S. airline to announce in-flight Internet service for domestic flights.  The first (test) phase of the American Airlines Gogo Internet service started in the middle of last year on the company’s fleet of 15 767-200 aircraft, which fly its transcontinental routes.

Recently, the company announced it will expand the service to over 300 domestic aircraft (the service doesn’t work over the Atlantic or Pacific oceans).

I am writing this from American Airlines Flight 15, New York (JFK) to San Francisco (SFO).  Until today, I hadn’t had to take a transcon flight since Gogo was launched so I was excited to try out the new service (most of my flying in the past nine months was transatlantic).

The last flight I took with Internet service was back in 2005, when Lufthansa and several other airlines still offered the Boeing Connexion service.

Once we hit 10,000 feet (we’re now at our cruising altitude of 32,000 feet), I turned on my trusty Lenovo ThinkPad X300 and it immediately found several Gogo hotspots.  It took just a few minutes to log in and purchase service for today’s flight (a Gogo representative was handing out 25% discount coupons during boarding, I should mention) and I chatted with customer service about how to use my BlackBerry Bold smartphone on the same account (all I have to do is log off from the laptop and then log in from the Bold).

Gogo really goes

So far I’ve done a speedtest, which showed a download speed of 1.55 Mbps (double what the Boeing Connexion service was able to offer), checked e-mail, and read news from several Web sites including the New York Times and the Wall Street Journal.  The flight attendant has already served warm nuts and drinks so I’m going to relax and enjoy the flight for a little bit and then report again.


We’re still at 32,000 feet, just crossing over Minneapolis.

Purchasing Internet access for one’s laptop entitles you to log into the Gogo system from your smartphone at no additional charge.  Smartphone support was recently introduced by Aircell, the company that runs the Gogo network, and it only took a few moments to point the BlackBerry Bold to the Gogo hotspot and log in.  I was surprised, but pleased, to find that I was able to use BlackBerry Messenger from the Bold, although I could not place or receive phone calls or send text messages.  BlackBerry mail worked as well, as did multiple applications I use regularly on the device.

Current position at 13:32 EDT

By the time I had interrupted multiple people via BlackBerry Messenger, the flight attendants were handing out hot towels and tablecloths and starting to serve lunch (I had the herbed shrimp with couscous).  During lunch, I reconnected to the Net via the ThinkPad and, using Slingbox, watched CNN and channel surfed.  The picture quality was surprisingly good and the audio quality was perfect.

After lunch, I checked in with a few colleagues via Lotus Sametime and read a few e-mail messages.

This is a working flight so I need to prepare a talk I’m giving tomorrow but I will continue this post later.


We just crossed the border from Nevada to California and I have been able to spend most of my time working, although connectivity was really only “required” sporadically.  I did get to finish an important document and e-mail it to where it was needed.  Absent Gogo, I could not have done that until we landed.  I know the recipient was waiting for it so having connectivity proved very beneficial.

In sum: is it an absolute requirement?  Of course not; we’ve gotten along without in-flight Internet access since the Wright brothers.  It was fun, however.

Lowering Your Information Overload Exposure

Thursday, April 23rd, 2009 by David Goldes

Information overload describes an excess of information that results in the loss of ability to make decisions, process information, and prioritize tasks.  The Basex Information Overload Exposure Assessment service provides companies with actual measurement of the problem and specific recommendations to lessen its impact.

Organizations of all shapes and sizes have already been significantly impacted by information overload, a problem that costs the U.S. economy $900 billion per year in lower productivity and throttled innovation.  Over the past year, we’ve been working with companies to reduce information overload in their organizations and they have seen almost immediate benefits in increased efficiencies, cost reductions, and improved information flow.

Basex helped one client, a global leader in communications, recapture approximately 12 percent of time lost due to information overload, valued at an estimated $2.8 million per annum. For another client, Basex helped reduce the company’s exposure by 18 percent with a commensurate increase in employee efficiency and effectiveness, translating into an annualized savings of several million dollars.
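To see how recaptured time translates into dollar figures of this magnitude, consider a rough illustration.  The numbers below are hypothetical inputs chosen for the sketch, not the actual figures from either client engagement:

```python
# Hypothetical illustration of how recaptured time becomes annual savings.
# All inputs below are assumptions for illustration only.
employees = 1000            # knowledge workers affected
hours_lost_per_week = 5     # hours lost weekly to information overload
loaded_hourly_cost = 45.0   # fully loaded cost per hour, in USD
recapture_rate = 0.12       # share of lost time recaptured (cf. the 12% example)
weeks_per_year = 48         # working weeks

annual_savings = (employees * hours_lost_per_week * weeks_per_year
                  * loaded_hourly_cost * recapture_rate)
print(f"Estimated annual savings: ${annual_savings:,.0f}")  # $1,296,000
```

Even with modest per-person assumptions, the savings across a large workforce quickly reach seven figures.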

This week, we are introducing the Basex Information Overload Exposure Assessment, a service that helps companies pinpoint opportunities to reduce the amount of information overload and simultaneously reduce costs within their organizations.  In a time when less is more, the ability to recapture what would otherwise be millions of lost dollars may mean the difference between profit and loss.

Find out more at our new Information Overload microsite.  An Information Overload Self-Assessment Tool is available to help you take the first steps in analyzing your organization’s exposure.

David M. Goldes is the president of Basex.

PC: Meet the Home Motor

Wednesday, April 22nd, 2009 by Jonathan Spira

The computer no longer conjures up an image of glass-walled rooms filled with blinking lights and scientists in white lab coats.  More likely, the image is that of a desktop computer, which has become the prevalent interface to the computing world since 1981, when IBM launched the first IBM PC.

Similar to the mainframes in those glass-walled rooms, desktops were largely immobile; they stayed put on the desk.  Portable computers, which trace their history back to early 1981, when the Osborne 1, weighing in at 11 kilograms, was introduced, were hardly portable and were very expensive.  [The first notebook-style computer was the NEC Ultralite of 1989 weighing a little over 2 kilograms.]  But desktop computers had one advantage: they were cheap and powerful compared to laptop computers.  That pricing disparity meant far more desktops than laptops were sold in the ensuing years.

Still, when we need to use a computer for computing (whatever that means), we typically go to it.  What we do with it has changed as well; most computers today are used to process words and text and to manipulate images, while the earliest mainframe computers performed calculations for researchers and statisticians.  Computers are also used to play games, listen to the radio, and watch television programming and movies, although these tasks are slowly being offloaded to purpose-built devices that typically embed a microcomputer, some storage, and a WLAN radio.

In my book Managing the Knowledge Workforce, I wrote of the Sears Home Motor, a popular home gadget ca. 1900.  The home motor powered anything that might need turning, such as a mixer, churner, beater, fan, buffer, or grinder.  Typically a household had one home motor and plugged the appropriate attachment into it.  Nowadays the home motor is a relic, obsoleted by the ubiquity of motors in devices that require them, such as a fan or mixer.  “Motors, and microprocessors, surround us but we don’t think of them individually,” I wrote then.  “As computers become more and more embedded, they, too, will disappear from view.”

I thought of the home motor when I began testing several netbooks, a new class of computer distinguished by its very light weight and very low price.  It may very well be the netbook that sets the computer free from the desktop once and for all.  AT&T is calling them “mini laptops” and offering them in several markets at prices starting at $49.95 with the purchase of 3G mobile broadband service.

Last week I spoke with Jeremy Brody, HP’s global business notebook product manager.  In discussing netbooks, he repeatedly used the phrase “good enough” to describe the computing experience that he and others believe purchasers are after.  While I understand what he (and others who use the term) may mean, I think that “good enough” implies that one must settle for something less than optimal.  While it is true that the specs of today’s netbooks are far less impressive than those of almost any $900 laptop on the market, netbooks are still probably as fast as, if not faster than, a laptop from 18 months ago, which is probably typical of what people already have.  I don’t think users are settling for “good enough”; rather, I think they are slowly but surely changing their habits in terms of where they use computers and for what.

Netbooks are designed for long battery life and fast Web browsing.  As more tools, applications, and data move into the cloud, a Web browser is the portal that users will go through.  Infrequent business travelers who don’t have a laptop might “settle” for a netbook as a companion device to their desktop computer.  Palm tried to invent the companion notebook market two years ago, announcing the Foleo in June 2007 and cancelling the project three months later (see http://www.basexblog.com/2007/09/07/foleo-ii/).  But the Foleo was somewhat flawed from the beginning, lacking storage and the ability to run standard desktop productivity applications, among other shortcomings.  In contrast, many netbooks come with Windows XP or even Vista, allowing almost anything that can run on a standard PC to run in this environment.

Companies such as HP are even making “business class” netbooks, which abandon plastic for metal and further blur the distinction.  I’ve been using (on a regular basis) a Lenovo ThinkPad X300 that, while not a mini, is close in weight at only 1.3 kg.  Despite its relatively small screen (13.3″), I have found that I am much more likely to take this computer with me on short trips (even those of just a few hours’ duration) and I am also able to, on more pleasant days, take the laptop outdoors and work while enjoying a less traditional, non-Dilbertian office environment.

The netbook may really represent an interregnum of sorts between traditional PCs and a new class of devices that may turn out to be a type of personal computing panel that folds up into something no bigger than a standard smartphone.  These PCPs (as I have named them) would use new paper-thin display technology and solid state storage and naturally leverage superfast ubiquitous Net access.  No one has announced anything such as this but this is where I see true personal computing going.

Home motor, anyone?

Jonathan B. Spira is CEO and Chief Analyst at Basex.

In the Briefing Room: Brightidea

Wednesday, April 22nd, 2009 by Cody Burke

Ideas.  Employees, customers, and business partners have lots of them, and companies large and small can be overwhelmed by them.  They do, however, need to be managed.  Most companies still try to manage ideas and suggestions the old-fashioned way, perhaps not with a wall-mounted suggestion box but with tools that have not strayed terribly far from that model.

Despite the wealth of idea management tools that exist today, some companies still get it wrong, a recent initiative by Starbucks being a prime example.

An effective idea management environment must support ideation and subsequent review.  Both of these must be done in lockstep, with the goal being to enable what amounts to a Massively Parallel Conference (MPC), defined by Basex as a massively scaled meeting that takes place in a computer mediated environment and facilitates many-to-many collaboration leading to many-to-one gathering of information.  In this case, the goal is to generate and refine ideas through large scale participation and brainstorming and then communicate the best ones to decision makers.

Brightidea, an idea and innovation management company, addresses these areas in its WebStorm idea management offering.  WebStorm is a solid ideation environment.  Think of it as a browser-based brainstorming session that supports the large scale generation of ideas, increasing the odds of quality ideas being generated and discovered.  Once submitted, ideas are ranked by session participants; the best ideas float up to the top, ensuring that good ideas do not languish on a manager’s overcrowded desk or inbox.
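The “best ideas float to the top” mechanic can be sketched in a few lines.  This is a generic illustration of participant ranking, not WebStorm’s actual implementation, and the idea titles are invented for the example:

```python
# Minimal sketch of participant-ranked ideation (generic, not WebStorm's code).
from dataclasses import dataclass

@dataclass
class Idea:
    title: str
    votes: int = 0

class IdeaSession:
    def __init__(self):
        self.ideas = []

    def submit(self, title):
        idea = Idea(title)
        self.ideas.append(idea)
        return idea

    def vote(self, idea):
        idea.votes += 1

    def top(self, n=3):
        # Highest-voted ideas surface first, so none languish unseen.
        return sorted(self.ideas, key=lambda i: i.votes, reverse=True)[:n]

session = IdeaSession()
a = session.submit("Self-serve returns portal")
b = session.submit("Dark-mode UI")
for _ in range(5):
    session.vote(a)
for _ in range(2):
    session.vote(b)
print([i.title for i in session.top(2)])  # highest-voted idea listed first
```

Commercial offerings layer profiles, comments, and workflow on top, but the core sorting-by-community-vote step looks much like this.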

The offering includes collaboration and social networking tools that enable a Community of Reliance to be formed around idea generation initiatives.  A Community of Reliance is formed on an ad hoc basis where members rely upon the participation and input of other members whom they may not actually know or come into direct contact with.  With WebStorm, participants rely on each other to rate and comment on ideas; this is enabled through individual profiles that include social networking functionality.  The ideas an individual has created, as well as the popularity of the ideas, are visible in the profile, allowing management to set up incentives for participation through recognition for the contribution of quality ideas.

Brightidea uses its own product internally to battle what it calls “idea overload”.  The company had found that its own product managers were being bombarded with suggestions, new feature requests, and other submitted ideas, and were spending significant time responding to them.  Using its own products to automate the steps that ideas, suggestions, and requests go through reduced this overload by pulling them out of e-mail and into a better-suited system.

For all companies, the creativity and passion of employees and customers can provide an extremely effective and cost efficient source of new thoughts and proposals.  Large-scale brainstorming, à la the MPC concept, could be an excellent model for companies to follow as they seek to generate new ideas, and products such as WebStorm make conducting this sort of event easier and more accessible.  Companies that are considering tapping the potential of large scale brainstorming should give thought as to how they can leverage an idea management solution to automate the workflow of ideas, provide incentives, and enable social and collaboration capabilities.

[For an in-depth look at the topic of idea and innovation management, you can also read our report Improving Profits Through Idea Management: How America's Smartest Companies Embrace Innovation]

Cody Burke is a senior analyst at Basex.

Disruption in the Tech Sector: Oracle to Acquire Sun for $7.4 billion

Monday, April 20th, 2009 by Jonathan Spira

Oracle announced it will acquire Sun, a rival IT firm, for $9.50 per share, or ca. $7.4 billion, or $5.6 billion net of Sun’s cash and debt.  The announcement comes two weeks after IBM ended talks to acquire Sun.

In acquiring Sun, Oracle is upending the IT industry.  Oracle has long-standing partnerships with Sun competitors HP and Dell, as well as with Sun, to provide servers on which to run Oracle databases.  The move puts Oracle into the hardware business, putting the company in even more direct competition with IBM, which sells its own servers in conjunction with its database tools and applications, as well as with HP and Dell.

In making the announcement, Oracle CEO Larry Ellison reminded IBM that Oracle will be “the only company that can engineer an integrated system – from applications to disk…”  IBM sold its disk drive business to Hitachi in 2002.

Oracle will gain ownership of two key Sun software assets: Java and Solaris.  The former, given Oracle’s use of Java for its Fusion Middleware business, may very well be the most significant software acquisition the company has made (in recent years, Oracle has acquired BEA, PeopleSoft, and Siebel).  Sun clearly didn’t want these assets to fall into the hands of a competitor.  Oracle will also be able to optimize the Oracle database to leverage unique features of the Solaris operating system, although the company took pains to state that it is “as committed as ever” to Linux and other open platforms.

While Sun might have represented a difficult entity for IBM to swallow whole, given the vast differences in corporate culture, the Oracle-Sun combination may be somewhat smoother.  Scott McNealy, Sun’s co-founder and chairman, and Ellison have been close allies and both have engaged in repeated Microsoft baiting over the years.

Sun is promising that the acquisition will add $1.5 billion in operating profit in the first year (the acquisition is expected to close this summer).  Very few acquisitions work as advertised (think Time Warner and AOL) and Sun lost almost $2 billion in the last two quarters.  Most customers of both companies, however, will likely cheer as the acquisition removes much (but not all) of the uncertainty in terms of product direction and support.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

The Lowly Web Browser: Mission-Critical Enterprise Tool?

Thursday, April 16th, 2009 by Jonathan Spira

Until recently, the Web browser was mostly not considered to be a critical element in the enterprise arsenal.  The Web itself was, to most people, a source of entertainment and news, not serious business applications.  Today, that picture has changed dramatically.  Web browsers are used for all kinds of mission-critical knowledge sharing and collaboration applications, from search to online meetings, from customer relationship management to content management.

In addition, knowledge workers typically customize their preferred browsers (most use Internet Explorer but an increasing number, myself included, use Firefox) for their needs.  In my case, upon booting, I open Firefox with 24 tabs on a dedicated display.  Most tabs contain information sources (Google News pages for various countries, the New York Times, the Wall Street Journal, among others), a few are open to social networking tools (e.g. LinkedIn, Facebook), and others provide a window onto internal data.

But what if my browser no longer worked?  That actually happened to me last Thursday.

It all started after Firefox updated itself to 3.0.8 (reminder to self: turn off automatic update), something I didn’t know it had done at the time.  All of a sudden, my browser, with its 24 open tabs, just vanished.  At first, I thought it was a fluke so I restarted it.  It crashed.  I rebooted. The problem persisted.

I then noticed that Firefox had updated itself to 3.0.8 that day, so I searched and found numerous reports of spontaneous crashes.  I downgraded to 3.0.7 and the frequency of the crashes diminished greatly; instead of crashing within two minutes or so of starting the browser, it would crash only after 20 or 30 minutes.  Clearly an improvement, but not a long-term solution.

Our help desk told me that I had done everything they would have done short of reformatting my hard drive (why is this always the proffered solution?) or switching to IE (which, unless something has changed, won’t launch with 24 open tabs).

Suddenly it was déjà vu all over again as I recalled why I had upgraded to the beta version of Firefox 3.0 well over a year ago.  At that time, Firefox 2.2 had stopped cooperating.  Moving to the next version solved the problem.

Fortunately, I also recalled that there were reasonably stable beta versions of Firefox 3.1 available.  And lo and behold, Firefox 3.1 beta 3 was the answer and my 24-window browser has been crash free since last Friday.

This does, however, bring up a larger issue.  As more users rely on browser-based services for business use, the stability of browsers becomes increasingly important.  Organizations need to review browser usage with an eye towards support and backup so that any unexpected failures (such as the one I experienced) do not result in unacceptable downtime.

Jonathan B. Spira is CEO and Chief Analyst at Basex.

EBay Hangs Up on Skype… But Who Will Call Next?

Wednesday, April 15th, 2009 by David Goldes

EBay purchased Skype in 2005 because it had the resources to do so.  At the time the deal was announced, Jonathan Spira wrote that “the purchase of Skype serves notice to the telecommunications industry that voice is merely another service delivered in a data setting.”

Since then, Skype has indeed become a telecommunications giant, albeit one that didn’t seem to have any of the promised synergies with its new parent.  This week, after months of speculation on the company’s future, eBay announced that it will sell Skype via an IPO in 2010.

Before that happens, Skype will have to resolve an intellectual property dispute with Skype founders Niklas Zennström and Janus Friis.  Joltid, a company they founded, retained ownership of the peer-to-peer technology Skype uses, and licensed it back to eBay.  Recently, Joltid said that eBay was in breach of their agreement, and eBay has asked a U.K. court to intercede.

The 2010 date gives eBay lots of time to continue to shop the company.  Negotiations with Skype founders Niklas Zennström and Janus Friis reportedly fell through but, absent the founders’ involvement, does it still make sense for Skype to operate as an independent company?  After all, how extensible, or profitable, is Skype’s most-used feature, free calls to other Skype users?  The company has over 400 million users (the figure was 405 million at the close of 2008) and revenue for the year was up an impressive 44%.  In addition, Skype is just starting to explore the business market, and that market is willing to pay for certain services.

The list of potential buyers most frequently mentioned is noticeable for an absence of telecommunications companies such as Deutsche Telekom, AT&T, Verizon, and BT.  Any one of these could build an instant bridge to the future of telephony by acquiring the company.  We’ll find out which telecoms company has a true vision for the future when we see an announcement of Skype’s sale in the next three to six months.

David M. Goldes is the president of Basex.

IBM-Sun Deal Collapses (but should it have ever gotten this far?)

Sunday, April 5th, 2009 by Jonathan Spira

IBM withdrew its $7 billion offer for Sun Microsystems today, putting an end to several months of exclusive talks between the two companies.  Sun, a company that was a pioneer and innovator in high-end workstations and servers that, as the company’s tagline once put it, “put the dot in dot-com,” has struggled in recent years although it has retained a valuable customer base, a treasure trove of intellectual property, and a vaunted research and development staff.

IBM’s decision to withdraw its offer, which dropped from $9.55 to $9.40 per share on Saturday, may be a negotiating tactic and the two sides could theoretically resume discussions.  IBM has not spoken publicly about the possible acquisition but the move would have led to a significant consolidation in the highly-competitive server and data center market.  IBM would end up way ahead of close competitor Hewlett-Packard and would gain entry into other key markets where it has little presence, including the growing storage market, now dominated by EMC and Network Appliance, as an added bonus.

While IBM would clearly benefit from the acquisition (the company has been weathering the economic downturn just fine, having seen increased profits despite a 6% decrease in revenue), Sun, on the other hand, actually needs a deal to survive in some fashion.  It has lost almost $2 billion in the last two quarters and has laid off some 2,800 workers this year as part of a cost-cutting exercise.  Sun has been trying to convince itself, in many respects, that it was a software firm (it fancied itself going head-to-head with Microsoft after acquiring StarOffice developer Star Division in 1999) but hardware has always been at the core of its business – and its earnings.  If IBM and Sun don’t return to the table, others may come calling.  Cisco recently entered the server market and would benefit from an instant installed base.  Sun’s customers, largely in government, financial services, and telecoms companies, are a loyal bunch, and that loyalty is what has kept Sun intact to date.  Sun’s other businesses, including the Solaris operating system, Web infrastructure software, Java, MySQL, and NFS, would be icing on the cake, although an acquirer might even sell or spin some of these off.

In addition, Sun would be a difficult entity for IBM to swallow considering the vast differences in corporate culture (laid-back Sun versus buttoned-down IBM) and IBM’s competitors wouldn’t hesitate to leverage any hiccups to their advantage.  Very few mergers work out as advertised; the promised efficiencies hardly ever seem to materialize (think DaimlerChrysler for a textbook example); and customers, sensing trouble ahead, look elsewhere.

The Wall Street Journal broke the news of the merger talks last month.  At the close of trading on Friday, Sun’s shares were at $8.49.

Jonathan B. Spira is the CEO and Chief Analyst at Basex.

In the Briefing Room: Virtual PBX iVPBX

Thursday, April 2nd, 2009 by Jonathan Spira and Cody Burke
1947 Stromberg Carlson PBX

IP-based telephony for smaller organizations is starting to get interesting again.  Last week, Skype introduced Skype For SIP and now, Virtual PBX is launching the iVPBX, a fully featured hosted PBX offering that supports Voice-over-IP (VoIP) softphones and SIP-compliant desk phones. [SIP is the prevalent open standard for business telephony networks and supports “sessions” in an IP network.] Built on the open systems platform that Virtual PBX introduced in November of last year, the iVPBX routes calls over the Internet to extensions using Gizmo5 VoIP phones as an alternative to using more expensive landline technology for the call.

Hosted PBX systems are the twenty-first century equivalent of Centrex, a PBX-like service developed in the mid-1960s where the switching took place in the telephone company’s central office; this contrasts with a PBX system, where the equipment is on site.  While Centrex was ideal for larger organizations that occupied multiple buildings or a campus, hosted or virtualized PBX services are ideal for small businesses, especially those where employees are found in many different locations.

The new iVPBX separates itself from more traditional hosted PBX systems in several ways, including pricing.  Traditional systems are generally priced based on a monthly allowance of free minutes, with a per-minute charge that kicks in when the free minutes are exhausted.  The iVPBX differs in that it allows unlimited inbound calling and does not have a per-minute fee; instead it relies on a per-seat pricing plan, around ten dollars per extension.
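The break-even point between the two pricing models is easy to sketch.  The per-minute rates, free-minute allowance, and base fee below are hypothetical figures for illustration; only the roughly ten-dollar per-seat price comes from the description above:

```python
# Hypothetical comparison of per-minute vs. per-seat hosted PBX pricing.
# Rates, allowance, and base fee are illustrative assumptions.

def per_minute_cost(minutes, free_minutes=500, rate=0.05, base_fee=25.0):
    """Traditional model: monthly fee plus overage beyond free minutes."""
    billable = max(0, minutes - free_minutes)
    return base_fee + billable * rate

def per_seat_cost(seats, seat_fee=10.0):
    """iVPBX-style model: flat fee per extension, unlimited inbound minutes."""
    return seats * seat_fee

# A five-seat office handling 3,000 inbound minutes a month:
print(per_minute_cost(3000))  # 150.0 under the traditional model
print(per_seat_cost(5))       # 50.0 under the per-seat model
```

For call-heavy small offices, the flat per-seat model wins; for low-volume users who stay within their free minutes, the traditional model can still come out ahead.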

The iVPBX is being launched as a joint offering with Gizmo5, a VoIP provider, although it will work with any solution that is fully SIP compliant and uses North American Numbering Plan (NANP) phone numbering for destination identification.

The offering is good news for existing Gizmo5 users as well as Virtual PBX customers.  For the former, the iVPBX offering adds PBX functionality such as call transfer, ACD queues, and automated attendant.  For the latter, new functionality will include call recording, instant messaging, and file sharing.

Both companies will be cross-marketing and selling the combined services, although, according to Greg Brashier, COO of Virtual PBX, the company will be looking to work with other VoIP providers who are also fully SIP compliant.

For Virtual PBX, similar to the quandary faced by Skype, the challenge will be how to communicate benefits to customers without the requisite flurry of acronyms.  For many, the low price point will suffice, but the enhanced functionality that users get from the iVPBX makes it worthy of consideration even for those who are not terribly budget conscious.

Jonathan B. Spira is CEO and Chief Analyst at Basex. Cody Burke is a senior analyst at Basex.