Wednesday, January 28, 2009

Virtualization requires a focus on storage

A recent survey of 93 respondents found that 24% expect at least limited use of desktop virtualization in 2009, while another third plan to evaluate the technology this year. (About 40% reported having no plans to explore or use desktop virtualization this year.) So it's safe to say that desktop virtualization is on the corporate radar.

If your firm is considering implementing a virtual desktop strategy, be sure to include a review of how data is stored and accessed by your users. Storage is the most overlooked cost of a virtual desktop strategy and could be the biggest cost delta in moving to a desktop virtualization environment. The cost delta can be mitigated by several factors. For instance, if your user policies currently redirect personal files to a network location, then virtualizing desktops will not add much of a burden. However, if personal files are stored locally, then additional storage needs will be significant.

If an investment in additional storage capacity is required, then that will turn out to be just the beginning of the story.

  • File types - You probably aren't too concerned about users storing music locally on their desktop, but that will need to be re-examined when those files are using up more expensive SAN space.
  • Data Center - The extra storage capacity will need to go somewhere, and it will need power. Is there room in your data center for the storage devices? Is there enough electricity to keep them running? Does your HVAC system have the capacity to handle the additional heat sources?
  • Internal Bandwidth - Before the virtual desktop, your users were accessing their massive Excel spreadsheets and Access databases locally. Once that desktop is virtualized, they will be pulling all of that data across your network, from the storage device to the central desktop server farm. This is not the biggest hurdle, but it's not trivial by any means.
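To get a feel for the scale involved, here's a rough back-of-envelope sketch. Every figure in it (user count, per-user data, working-set size) is an illustrative assumption, not a benchmark; substitute your own numbers.

```python
# Back-of-envelope estimate of the storage and bandwidth impact of
# moving locally stored user data into a virtual desktop farm.
# All inputs below are illustrative assumptions.

users = 500           # desktops to be virtualized
local_data_gb = 20    # avg personal data stored locally, per user
working_set_gb = 2    # data each user actually pulls per day
workday_hours = 8

# Capacity: local files that must now live on shared (SAN) storage
extra_san_tb = users * local_data_gb / 1024
print(f"Additional SAN capacity: ~{extra_san_tb:.1f} TB")

# Bandwidth: average sustained load from users pulling working sets
# across the network instead of reading them from a local disk
avg_mbps = users * working_set_gb * 8 * 1024 / (workday_hours * 3600)
print(f"Average sustained network load: ~{avg_mbps:.0f} Mbps")
```

Even with these modest assumptions, the result is roughly 10 TB of new SAN capacity and a few hundred megabits of sustained traffic, and real usage is bursty, so peak load will be considerably higher.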

Please contact Roig Consulting for an objective assessment of these issues before getting too far down the virtualization road. We can help you plan and prepare for all of the implications of your virtualization strategy.

Google expands its email offerings

Google has announced the test release of an important email feature: offline access. This may not seem like such a big deal to the average corporate user. After all, we've been using email on the plane for years now, syncing up when we get to a network connection. But Gmail is different. There is nothing on your PC... it's all on the internet. This is the foundation of cloud computing, and this release represents a radical strategic exception for Google, which has staked its future on cloud computing.

The bottom line is that Google needed to do this if it wants to be a serious player in the corporate market. Consumers will tolerate not being able to read their email when there's no network, but business users will not. "This is a feature we've heard loud and clear the enterprise wants," said Todd Jackson, Gmail's product manager.

But will it be enough to move large corporate customers off of the industry standard, Microsoft Exchange? Will it even be enough to move small companies away from other open source, but more traditionally architected solutions? I'm not so sure.

Email is the mission critical application of modern business. You can survive having your A/P system down for a day. But take away email and you'll see everyone from the CEO to the receptionist gathering torches and pitchforks on their way to the data center. There are very good reasons why Exchange has earned its place as the market leader:

  • It's highly reliable,
  • It's tolerant of administrator and user mistakes, and
  • It's very straightforward when it comes to recovery.

Who in their right mind would want to mess with that? This is the hill that Google is trying to climb. And, given the current climate, I suspect that very few decision makers will risk making that kind of change.

For more information: A blog post on CNet

Friday, January 23, 2009

Verizon pushes cell - landline convergence

Next month, Verizon Wireless plans to release a new, land-based phone that claims to integrate seamlessly with their cellular network. The phone is a VoIP device that uses the customer's existing internet service for access. While they say they don't intend to compete with Vonage and Comcast internet phone service, the package is aimed squarely at the market segment that would consider switching from traditional phone service to one of those products.

The package consists of a very expensive phone ($199, after rebate) and a monthly service plan ($34.99). "Phone" is probably a misnomer, though... at least as much as "cell phone" is a misnomer for the BlackBerry that's strapped to my waist. The device incorporates a 7" screen which allows the user to access a limited number of web-based applications, such as news, sports, and traffic information. It's also where users can manage their calendars and send and receive text messages from Verizon Wireless phones. Take a look at this picture:

And this is the primary advantage that the device offers over Vonage and Comcast. I can use a softphone client on my PC over my existing Comcast internet access, thereby combining all the advantages of a phone and a PC. But it requires that I fire up my machine whenever I want to make a call. The Verizon device is "always on" and always connected. It really becomes a much more convenient PC that is relatively unobtrusive. It's a PC you can fit in your kitchen, for instance.

There are distinct disadvantages relative to Vonage and Comcast, cost being the most prominent. But I suspect that's a temporary state of affairs. I would bet on Dell or Apple coming out with a competing device that's, at least in the case of Dell, more favorably priced. Keep an eye on this. It should be very interesting.

Read more at C-Net: Verizon Wireless launches new product

Thursday, January 22, 2009

Google announces a profitable fourth quarter

Somehow, someway, Google was able to record a reasonable profit in 4Q2008. According to their press release:
“Google performed well in the fourth quarter, despite an increasingly difficult economic environment. Search query growth was strong, revenues were up in most verticals, and we successfully contained costs,” said Eric Schmidt, CEO of Google. “It's unclear how long the global downturn will last, but our focus remains on the long term, and we'll continue to invest in Google's core search and ads business as well as in strategic growth areas such as display, mobile, and enterprise.”
Other highlights of the announcement include:
  • Total Revenue of $5.70 billion in the fourth quarter of 2008... an 18% increase over fourth quarter 2007 revenues.
  • Operating income was $1.86 billion, or 33% of revenues, in the quarter.
  • Net cash provided by operating activities for the fourth quarter of 2008 totaled $2.12 billion.

Today, the stock closed at $306.50, up $3.42, for a gain of 1.13%.

Even as the world economy struggled in the latter half of 2008, Google managed to post improvements over 2007. I know of no other company that was able to manage that feat. Their product is no longer seen as a discretionary item. They are indispensable to businesses and consumers... at least for now. And that is the secret to success in any market!

Read the Press Release from Google.
Google Inc.: GOOG (NASDAQ)

Citrix attempts to expand virtualization

The other day, Citrix announced that they are developing a desktop virtualization platform tuned for Intel Core 2 desktops and Centrino 2 laptops with Intel vPro technology. The company believes that this will expand its market footprint by combining central application control with portability and personalization. The new platform aims to enable IT professionals to dynamically stream a centrally managed corporate desktop and applications onto a secure, isolated client-based virtual machine. The trick, according to Citrix, is the ability to cache and execute desktop and application software directly on the PC client.

In English, that means that IT departments will be able to deploy the most commonly used business applications (Outlook, Excel, Word) to users who aren't constantly connected to the network (traveling executives). This has, to date, been the insurmountable obstacle to near universal adoption of virtualization strategies in any business. The C-Level team will support you as you use virtualization to save money ... right up to the point where you tell them they can't check email on the plane. They may wait until you leave the room to start laughing... or they may not. You can never predict how those conversations will really play out.

In any event, if Citrix has been able to pull this off, and if they don't price themselves out of the market, this could truly represent an inflection point in virtualization adoption. On the other hand, it could be just another in the long line of cool technology solutions in search of a real-world problem.

Citrix Working On Desktop Virtualization for Intel Devices (from CRM Daily)

Tuesday, January 20, 2009

Some big names won't be around in 2010

A recent survey by CIO Insight identified twelve companies that will either fold or be acquired in 2009. 200 participants were asked to indicate which companies, out of a choice of more than 20, would encounter one of those fates this year. The results are in the graph below.

A few surprises for me.
  • - It's hard for me to imagine that 19% of respondents picked them to go away this year. Their growth has been fairly steady, and at times spectacular. And I don't see Oracle or Microsoft wanting to acquire a product that competes with their in-house apps. Their alliance with Google bears watching, though.
  • AMD - As far as I can tell, there is no other viable competitor to Intel. And the market is too big for just one player.
  • CA - This company, formerly known as Computer Associates, isn't going anywhere. Their strategy has been growth through acquisition for nearly two decades, and that kind of culture makes them a very unlikely target.

The other big names on the list are, in my opinion, very much in play for this year. Few will go out of business, but at least one or two will end up in a deal. From this list, I'd expect Citrix, Juniper and Checkpoint to be the most likely acquisition targets. Novell is part of Microsoft's Linux Defense strategy, but it provides no other real benefit to society. Expect that product to be even less relevant a year from now.

CIO Insight survey results: Dire Predictions
Survey Methodology: How the survey was conducted

Virtualization requires security focus

As virtualization takes on greater significance in the data center, it's critical that your security measures keep up with the changes. The security challenges presented by virtualization are different from a traditional environment in several ways. For instance, if network traffic no longer needs to be transported through a switch, then monitoring the switch for suspicious activity will not be adequate.

IT leaders have been slow to recognize the need for solutions designed for a virtualized architecture. According to Nemertes Research, only 9.6% of participants in their recent Virtualization benchmark are currently deploying third-party tools focused on security in a virtualized environment. Since internal threats account for more than 15% of reported data breaches (see last week's post - Data Breaches on the Rise in 2008), this represents a significant gap in security execution. Why? Because internal staff are far more likely to discover the details of your data center architecture, and therefore are in a unique position to exploit any deficiencies.

Now is the time to review your security strategy. Contact Roig Consulting for a complimentary consultation.

Nemertes Impact Analysis

Monday, January 19, 2009

Is Intel cutting prices?

Last week, Barron's reported that Intel is planning significant price cuts across multiple chip lines in the coming months. The new pricing is expected to affect their quad-core and dual-core server chips, including the Xeon line. The impact could be as much as 40% for some models.

Barron's source is an industry analyst named Michael McConnell, and I haven't seen any confirmation or denial from Intel. So it will be interesting to see how this plays out over the coming months. AMD is increasingly aggressive in the server market, which I believe to be the big growth industry for the next 5-10 years.

Tech Trader Daily from Barron's - Intel reportedly cutting prices
Intel Corporation: INTC (NASDAQ)
Advanced Micro Devices, Inc.: AMD (NYSE)

Friday, January 16, 2009

USB finally gets an upgrade

At last week's Consumer Electronics Show (CES) in Las Vegas, Intel debuted the next version of USB. USB 2.0 was introduced in April 2000, and since then, thousands of devices have used the protocol to establish wired connectivity. Everything from cameras to PDAs to external storage devices depends on the USB 2.0 standard.

USB 3.0, however, promises much better performance. The theoretical limit for data transfer over a USB 2.0 connection is 480Mbps. Intel claims that USB 3.0 will move data at an astounding 5Gbps... more than a tenfold increase. Other standards have emerged since USB 2.0, like eSATA and FireWire. Both are far superior to USB 2.0, running at 3Gbps (eSATA) and 800Mbps (FireWire - full duplex). Yet neither will be able to measure up to USB 3.0.
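To put those numbers in perspective, here's a quick sketch of how long a large file transfer would take at each interface's rate. These are raw theoretical signaling rates; real-world throughput is well below them due to protocol overhead, so treat the absolute times as illustrative.

```python
# Time to copy a 100 GB external drive at each interface's
# theoretical raw rate (real-world throughput is lower).

rates_mbps = {
    "USB 2.0": 480,
    "FireWire 800": 800,
    "eSATA (3 Gbps)": 3000,
    "USB 3.0": 5000,
}

file_gb = 100
file_megabits = file_gb * 8 * 1024  # 1 GB = 1024 MB, 8 bits per byte

for name, rate in rates_mbps.items():
    minutes = file_megabits / rate / 60
    print(f"{name:>15}: {minutes:5.1f} minutes")
```

Even in this idealized comparison, the same copy that ties up a USB 2.0 port for nearly half an hour finishes in under three minutes at USB 3.0 rates.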

The technology is expected to be production-ready by early 2010; however, don't expect to see widespread use until 2011 or 2012. The devices that use the protocol will have to adapt, and PC components will need to be developed. There are physical differences in the connection points, and device drivers need to be created and tested.

So there is a lot to do before the bulk of the consumer market can take advantage of this advance. Yet, it's encouraging that Intel has begun tackling this challenge. Portable devices are able to store huge amounts of data, and USB is still the practical way of putting it there. Yet, lots of data means lots of time spent waiting for the files to transfer. USB 3.0 is an important step in the right direction.

Intel Corporation: INTC
Photo Credit: Reuben Lee/CNET Asia
More Information: Cutting Edge blog post @ CNet

Thursday, January 15, 2009

Data breaches on the rise in 2008

The Identity Theft Resource Center (ITRC) published a report last week on data breaches in 2008. According to the report, published January 6, 2009, reports of data breaches are up significantly over 2007. Of course, this doesn't necessarily mean that there are more data breaches than before. Reports of breaches are bound to increase due to heightened awareness of the issues, laws and regulations enacted specifically to address the issue, and public pressure. Experts believe that the increase in the number of reported data breaches can be traced to these factors, as well as the likelihood that criminal activity in this area is, in fact, on the rise.

The information is sobering. For example, "only 2.4% of all breaches had encryption or other strong protection methods in use. Only 8.5% of reported breaches had password protection." This means that the vast majority of reported data breaches were of unprotected information. It is akin to leaving your car running and unattended in the grocery store parking lot. It's just too easy.

Another troubling piece of data is that nearly 16% of the breaches were traced back to malicious internal behavior (please see the nearby table). I believe this means that the work that IT leaders have done to protect their data against external threats has likely been reasonably successful. However, it also means that internal control is too weak. Companies need to invest in ensuring that employees have access to only the data they need to be successful in their work. Not all of this will be accomplished by technology, and at some point you will simply have to trust your people. But clearly there are improvements available.

Finally, the report tells me that institutions are not valuing their data. An element of the report indicates how much data was actually exposed. In the financial sector alone, over 18 million records were exposed in 2008. Of those, over 750,000 records included password information. For a bit of perspective, that's more people than live in North Dakota.

The fallout from appearing in a report like this can be devastating. There are legal penalties and the potential for civil action, not to mention the damage done to a company's brand. Now is the time to implement firm data access guidelines. Roig Consulting can evaluate your needs and help you develop appropriate policies for your firm.

The ITRC report: 2008 Data Breaches Report
Additional Detailed Data: 2008 Data Breach Statistics
ITRC Home Page: The Identity Theft Resource Center

Wednesday, January 14, 2009

Nortel Networks files for Bankruptcy

In a long-anticipated move, Nortel Networks announced that it is filing for bankruptcy protection in the Ontario Superior Court of Justice. The Canadian firm had posted a $3.4 Billion loss in the third quarter, and had been under investigation by the Securities and Exchange Commission for a number of accounting irregularities. The company's American and European subsidiaries are also expected to file for bankruptcy protection.

It's been evident for the last 3 or 4 years that Nortel's core product line - the switch-based private phone system (PBX) - has been facing its eventual demise. Voice over IP (VoIP) has proven itself to be far more cost-effective over the long term, and high quality vendors like Cisco and Avaya have taken over leadership of the telecommunications industry. Whether Nortel can emerge from bankruptcy or not is obviously an open question. But I have my doubts. Phone systems are long-term investments, and a buyer wants to be reasonably sure that the manufacturer will be around to stand behind their products. Filing for bankruptcy will, of course, cast substantial doubt on that part of the equation. As a result, sales to new customers will dry up almost instantaneously, and sales to existing customers will be hit with enormous competitive pressure.

It's really too bad, because Nortel made excellent switch-based PBX systems. They were highly reliable and relatively easy to maintain. Plus, they had cultivated great relationships with their reseller community, making top sales engineers readily available for proposals and site surveys. I wouldn't be too surprised to see Cisco snatch up a significant part of the business, using that strategy to convert more customers to their VoIP product line.

Cisco Systems, Inc.: CSCO (NASDAQ)
Avaya Inc.: Privately Held
Nortel Networks Corporation: NT (TSE)

Tuesday, January 13, 2009

New wireless access point technology from Cisco debuts

Cisco is in the process of releasing a new wireless access point for commercial applications. The company claims that the new device, called the Aironet 1140 Series Access Point, will not only enable high-bandwidth applications across a wireless connection, but it will improve the throughput of existing wireless access points.

The press release is naturally short on details. However, if the claims are to be believed, then this technology should help companies make better use of their existing and new office spaces. It runs on the 802.11n protocol, which is more effective than previous versions (802.11a/g). So as a firm needs to make additional work areas available for staff, it might not need to run Cat 5 cabling.

I would recommend allowing this technology to mature for about 6 months before deploying it for mission critical applications -- like voice. However, I believe it will be ready for widespread adoption very quickly. Even older technologies, like 802.11g, are viable for voice in limited applications (2-3 users). So, it's not a huge leap to expect this device, built by a proven industry leader, to live up to what I deem to be fairly reasonable claims.

Please contact Roig Consulting for information on how to take advantage of this important technology.

The press release as published by InformationWeek:
Cisco Rolls Out 802.11n Access Point
Cisco Systems, Inc.: CSCO (NASDAQ)

Monday, January 12, 2009

Beta release of Windows 7 gets going

This past Friday - January 9, 2009 - Microsoft (MSFT) released the beta version of its next operating system. Originally, the company had placed an upper limit of 2.5 million downloads of the new system. However, after a few false starts with the release, including overloaded servers, the company decided to remove the cap, allowing unlimited downloads through January 24, 2009.

It's likely that the beta will still be available for download after the download period expires, as has often been the case in the past.

More information: Microsoft Ditches Windows Beta Download Limit
More information: Microsoft blog announcement

Friday, January 9, 2009

Maintenance Contracts are the big revenue stream

While researching the technology news for today's posting, I stumbled across this chart. It indicates the percentage of revenue major firms derive from maintenance contracts. It should be a real eye-opener for CIOs, CTOs and CFOs (Credit: Goldman Sachs).

I find it astounding that Computer Associates and BMC get about 60% of their top-line revenue from maintenance agreements. There are two takeaways as far as I can see.
  1. If it seems the sales reps are pushing maintenance contracts as if their careers depended on it, then it's probably because they believe their careers actually depend on it.
  2. This is a negotiating leverage point. A firm should absolutely be able to use this information to get better deals on licensing fees, as well as on professional services.
On a related topic, isn't it time to rethink the whole maintenance agreement?
What are you actually buying? Upgrade assurance? Response Time?
  • If all you need is to be able to get the next version for free, can't that be negotiated into the original agreement?
  • And how many times has your staff called into their customer support requiring immediate attention? Aren't most emergency situations handled by your team? Do you really need to fork over tens of thousands of dollars every year for a service you hardly use, and probably don't need?
I believe that the maintenance fee gravy train is coming to an end. Competitive pressures will force firms like these to rein in their fee structure. It may not happen in 2009, but within five years, we'll see a major shift in this area.

Thursday, January 8, 2009

SAP declares that SaaS will not work

The CEO of SAP, Bill McDermott, predicts that Software as a Service will never be a viable platform for large companies. He cites several common-sense reasons as the basis for this opinion, including the need for control of proprietary information, as well as the challenge of integrating data across multiple platforms.

Stripping away the dramatic use of the word "never", McDermott stakes out a fairly reasonable position on the future of SaaS. The largest global organizations already run their operations in a kind of SaaS environment.

How so?
Think about a centrally located enterprise SAP or Oracle implementation that serves users across multiple locations and P&L centers. The firm doesn't have an individual installation for each business unit; all the business units share a single, large instance of the software. Presumably the costs associated with the software are spread across the business units. Doesn't that sound a lot like a SaaS model? The only difference is that the centralized IT organization typically does not get actual money from the user community the way that a SaaS vendor does.

So it stands to reason that a SaaS model does not currently make sense for large enterprises.

But, "never" is a long, long time. And the economics that justify proprietary investments in enterprise software are certainly subject to change. The SaaS cost structure does not necessarily work so well for customers that expect to increase their number of users significantly. However, the in-house installation model does not work so well for firms that expect to decrease their user base in the coming years.

I believe that Software as a Service is a viable path for small and mid-sized firms. If they grow into large firms, then the ongoing subscription expense will be the impetus for re-examining that strategy. Until then, however, the avoidance of the upfront investment -- in cash -- presents a compelling case for taking advantage of SaaS offerings.
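To make that breakeven logic concrete, here's a toy five-year comparison. Every price in it is invented purely for illustration, and the in-house model is simplified (licenses bought as headcount grows, plus annual maintenance on the installed base):

```python
# Hypothetical 5-year cost comparison: SaaS subscription vs. in-house
# license. All prices are invented assumptions for illustration only.

saas_per_user_month = 100   # assumed subscription fee, per user
license_per_user = 1500     # assumed one-time license cost, per user
maintenance_rate = 0.20     # assumed annual maintenance, % of license

def saas_cost(users_by_year):
    # Pay the subscription for every active user, every year.
    return sum(u * saas_per_user_month * 12 for u in users_by_year)

def inhouse_cost(users_by_year):
    # Buy licenses as headcount grows; pay maintenance on everything owned.
    owned, total = 0, 0
    for u in users_by_year:
        if u > owned:
            total += (u - owned) * license_per_user
            owned = u
        total += owned * license_per_user * maintenance_rate
    return total

growing = [50, 100, 200, 400, 800]     # headcount doubling each year
shrinking = [800, 400, 200, 100, 50]   # headcount halving each year

for label, users in (("growing", growing), ("shrinking", shrinking)):
    print(f"{label:>9}: SaaS ${saas_cost(users):,.0f} "
          f"vs in-house ${inhouse_cost(users):,.0f}")
```

With these made-up numbers, the growing firm ends up paying more under the subscription model, while the shrinking firm comes out well ahead with SaaS because it stops paying for users it no longer has. The crossover point shifts with the prices, but the direction of the effect is the argument above.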

Oracle Corporation: ORCL (NASDAQ)
Salesforce.com, Inc.: CRM (NYSE)
InformationWeek Article: SAP CEO: SaaS Won't Work As A Core Platform

Tuesday, January 6, 2009

Software Vulnerabilities Exposed

A recent study lists Firefox, EMC VMware, Citrix, iTunes and 8 other popular software titles as the most vulnerable applications currently in use. In order to merit this dubious distinction, the software must meet various criteria, including:
  • Must run on Windows
  • Well-known to the general computing public
  • Generally regarded as non-malicious by most computer departments
  • Had at least one reported security flaw during 2008
  • Requires the end user to maintain the security - as opposed to central application administration.
This necessarily narrows the list -- somewhat unnaturally -- since most of the Microsoft products in popular use can be regulated via a centralized application management tool, such as SMS or WSUS. The study is focused on a corporate audience, as the publishing enterprise - Bit9 - sells a software management tool that addresses the problems that enabled these titles to make the list. So the results need to be taken with generous skepticism.

Nevertheless, the list makes for interesting reading, as it includes some of the recognizable names in consumer and corporate technology.

2008's Popular Applications with Critical Vulnerabilities

  • Mozilla Firefox
  • Adobe Flash & Acrobat
  • EMC VMware Player, Workstation and other products
  • Sun Java Runtime Environment (JRE)
  • Apple QuickTime, Safari & iTunes
  • Symantec
  • Trend Micro
  • Citrix Products
  • Aurigma, Lycos
  • Skype
  • Yahoo! Assistant
  • Microsoft Windows Live (MSN) Messenger
From a corporate perspective, there are fairly easy ways to protect against the reported vulnerabilities, even without a tool like the one Bit9 is peddling. A sensible corporate policy regarding these applications, coupled with a thoughtful desktop image, will take care of the bulk of the risk.

Download the study from Bit9: The Most Vulnerable Applications—2008 Report

Monday, January 5, 2009

Business Process Improvement tops 2009 CIO Priorities

A recent survey of 220 IT executives (by CIO Insight) showed that 42% of CIOs believe that Improving Business Processes is a top priority for 2009. Surprisingly, Cost Cutting was considered a top priority by only 38% of respondents, while only 25% named Generating more Business as a priority for this year.

The survey results do not provide any details as to when the survey was conducted. Since the survey was published in early December, it's quite probable that the full impact of the economy's 3rd and 4th quarter performance was not accounted for in their responses. At the same time, however, by October, I was already starting to see a pull-back on IT initiatives. Nearly everyone I communicated with was focused on cost reductions and hard savings. And virtually no one was looking at the soft savings promised by many IT initiatives. In fact, even projects with clear and hard-dollar ROI were being scrutinized and frequently delayed.

What explains the difference?
The only thing I can think of is that the respondents expect the economy to begin righting itself sometime during the course of 2009. I haven't seen any evidence to substantiate this belief, but I cannot come up with any other reasonable explanation for these responses.

Another fascinating detail in the responses is in the cost-cutting results, which I've highlighted below. Notice the disparity between smaller (< $500 million) and larger (> $500 million) organizations. Apparently, the larger organizations were ahead of the curve on belt-tightening. The opposite disparity also shows up, to a lesser degree, in the customer service response. Taken together, this tells me that larger organizations are going to be in a better position to take advantage of the eventual recovery than their smaller competitors. And we should adjust our strategy accordingly.

Summary of Responses - Top CIO Priorities for 2009:

  • Improve Business Processes - 42% ... (Less than $500M - 43%, More than $500M - 40%)
  • Deliver Better Customer Service - 41% ... (Less than $500M - 44%, More than $500M - 37%)
  • Cutting Costs - 38% ... (Less than $500M - 34%, More than $500M - 45%)
  • Generate More Business - 25% ... (Less than $500M - 28%, More than $500M - 21%)
  • Innovate New Products and Services - 22% ... (Less than $500M - 23%, More than $500M - 19%)

Link to full survey results at CIO Insight: Top CIO Priorities for 2009

Friday, January 2, 2009

Browser choices change dramatically in 2008

Over the past year, surveys indicate that a dramatic shift is underway in the choice of browsers. According to a study published by Net Applications, Microsoft's Internet Explorer has lost over 10% of global market share. About two-thirds of those users have switched to Firefox, while most of the rest have moved over to Safari (by Apple). The results are summarized in the chart, showing browsers that garnered at least 0.01% of the market.

The survey results are significant in several ways.
  • First, Firefox now commands -- for the first time in its history -- a 20+% market share. This means that Microsoft's attempt to be the pre-eminent gateway to the Internet has failed. IE's lack of innovation and poor quality have cemented this fact for the firm.
  • Second, Google's new browser, Chrome, went from zero to 0.30% of the market in less than nine months. This speaks volumes about Google's ability to influence the market and stands in stark contrast to Microsoft's failure in this regard.
  • Third, Playstation is on the list. Let me repeat that. Playstation ... a gaming console ... has earned 0.04% of the market in 2008; which is double their market share from the previous year. Whoever said that TV/Internet convergence is dead was premature.
  • Finally, as IE fades, the door will swing just a bit wider for Linux desktops in the corporate environment. Firefox has earned its place as the default browser choice for Linux PCs with excellent quality and innovation over the past several years.

Microsoft: MSFT (NASDAQ)