Tuesday, May 6, 2008

Do the Old Analysis Rules Still Apply?

I was going through some old papers the other day, and I found the presentation I used to give to newbies at Aberdeen Group on how to see past marketing hype – or, as we called it then, the Bull#*! Detector. One of the best ways, I suggested, was to “Drag Out Some Old Rules” about what would and would not succeed in the marketplace. Looking at that presentation today, I was curious as to how well those Old Rules held up, 11-12 years later.

Here are the three rules I wrote down:

Emulation rarely works. By that I meant, suppose someone came to you and said, “I have a great new platform designed to run Linux. And, if you need Windows, it emulates Windows, too; you’ll never tell the difference.” But you would be able to tell the difference: fewer Windows apps supported, slower performance, bunches of additional bugs, and slowness to adopt the latest Windows rev. And that difference, although you might not be able to articulate it, would lead the bulk of users to choose a Wintel platform for Windows over the Linux one.

I’m no longer sure that this rule will hold up; but it may. The key test case is businesses allowing employees to use Macs at work. An article in last week’s Business Week suggested that there was a big trend towards using the Mac at work, and large businesses were beginning to allow it, as long as they could run their Windows apps on those Macs. But if this rule holds up, the bulk of large businesses are never going to switch to the Mac; neither IT nor the employees will accept a solution that’s more of a pain in the neck, no matter what Microsoft does with Vista.

Openness and standards mean slower development of new products and lower performance; proprietary means lock-in. The logic to this one was simple: if you’re going to be open, one of your main foci is adhering to standards; and standards tend to lag the technology, while keeping your programmers from fine-tuning to squeeze out the last ounce of performance. (I don’t think anyone argues with the idea that proprietary, fine-tuned products lock users into a vendor.)

I don’t think this one has been disproved so much as it has become irrelevant. Most if not all software created today is mostly or entirely open and standards-based, even when it is not open source. And there are so many other variables affecting software development and performance today, such as the effective use and reuse of components and the choice of REST over SOAP, that whatever performance is lost by sticking to standardized code rather than fine-tuning is at best a rounding error.
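To make the REST-versus-SOAP point concrete, here’s a minimal sketch of mine (not from the original presentation) of the same hypothetical “get customer” call expressed both ways; the endpoint, operation, and field names are invented purely for illustration. In Python, just printing the two wire formats side by side:

# Illustrative only: a made-up "getCustomer" call expressed two ways.
# REST style: the request is essentially a verb plus a URL.
rest_request = "GET http://example.com/customers/1234 HTTP/1.1"

# SOAP style: the same call wrapped in an XML envelope, typically POSTed
# to a single service endpoint.
soap_request = """POST http://example.com/soap HTTP/1.1
Content-Type: text/xml; charset=utf-8

<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getCustomer xmlns="http://example.com/customerservice">
      <customerId>1234</customerId>
    </getCustomer>
  </soap:Body>
</soap:Envelope>"""

print(rest_request)
print()
print(soap_request)

The point isn’t that one is “better”; it’s that choices like this swamp whatever performance might be gained or lost by hewing to a standard.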

If it’s Wintel, it’s a de facto standard; if it’s anything else, it’s not. To put it another way, Microsoft was the only vendor with the power to produce software that was immediately and broadly accepted in the market and used as the basis for other apps (as with Windows, Office, and SOAP).

I would argue that, despite appearances, this rule still holds to some extent. Sun had a big success with Java, and IBM with Eclipse; but those were not imposed on the market; rather, they were made available to a rapidly growing non-Microsoft community that chose them over other non-Microsoft alternatives. Meanwhile, despite all we might say about Vista, a massive community of programmers and lots of end users still look first to Microsoft rather than to the Linux/Java/open-source community for guidance and solutions.

Here are a couple of Rules that I didn’t put down because they were politically incorrect at Aberdeen back then:

Automated speech recognition/language translation/artificial intelligence will never really take off. Despite the wonderful work being done then, and still being done, in these areas, I felt that the underlying problem to be solved was “too hard.” To put it another way, one of the great insights of computer science has been that some classes of problems are intrinsically complex, and no amount of brilliance or hard work is going to make them good enough for what the market would really like to use them for. The above three, I thought, fall into that category.
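For a rough sense of why I call these problems “too hard,” here’s a back-of-envelope sketch in Python; the vocabulary size and utterance length are numbers assumed purely for illustration, not figures from any study, and combinatorial growth is only one facet of the difficulty.

# Rough illustration with assumed numbers: how the space of possible word
# sequences grows for a naive, brute-force approach to speech recognition.
vocabulary_size = 20000    # assumed size of an active vocabulary
utterance_length = 10      # assumed number of words in an utterance

candidate_sequences = vocabulary_size ** utterance_length
print(f"{candidate_sequences:.2e} possible word sequences to choose among")
# Prints roughly 1.02e+43 -- any real recognizer has to prune or model its
# way around growth like this, which is one rough way to see what
# "intrinsically complex" can mean.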

I note that periodically Apple or IBM or someone will announce, with great fanfare, speech recognition capabilities in their software as a key differentiator. After a while, the marketing pitch fades into the background. NaturallySpeaking has not taken over cell phones, most language translation at Web sites is not great, and there is no widespread use of robots able to reason like humans. I believe this Rule has held up just fine, and I might add “pen computing” to the list. The jury’s still out on whether electronic books will fall into this category, although Paul Krugman is now giving the Kindle an excellent review in his blog.

Network computing, under whatever guise, will always be a niche. Many folks (like, I think, Larry Ellison) still believe that a dumbed-down PC that has its tasks taken over by a central server or by the Web will crowd out those pesky PCs. My logic, then and now, is that a network computer was always going to sit, price-wise, between a dumb terminal (read: a cell phone) and the PC – and products that are midway in price and not dominant in the market inevitably get squeezed from both sides.

Now, I know that many IT shops would love to go to network computing because of the PC’s security vulnerabilities, and many Web vendors hope that most end users will eventually trust all of their apps and data to the Internet exclusively, rather than having Web and PC copies. I say, ain’t gonna happen soon. This Rule still seems to be holding up just fine.

Finally, here are a couple of Rules I have thought up since then:

Open-sourcing a product means it never goes away, but it never dominates a market. That is, open-sourcing something is an excellent defensive move, but it closes off any offensive that would take over that market. Thus, open-sourcing, say, Netscape Communicator or CA-Ingres has meant that a loyal following keeps each product fresh and growing, and you attract new folks, such as the open-source community. However, it has also meant that others now view those products as not offering key differentiating technology worth a look for use in new apps.

The test case for this one today is MySQL. Over the last four years, MySQL has probably amassed more than 10 million users, but those are mostly new users in the open-source/Linux community, not converts from big Oracle or IBM shops. Look at Oracle’s revenues from Oracle Database over the last few years, and you’ll see that open-source databases have not had the expected impact on either revenues or pricing. My Rule says that will continue to be so.

And here’s a Rule that’s for the Ages:

There’ll always be Some Rules for analysts to Drag Out to detect marketing hype. From my observation, all of us are guilty at one time or another of over-enthusiasm for a technology. But I would say that the good long-term analyst is one who can occasionally use his or her Rules to buck the common wisdom – correctly.
