Saturday, July 30, 2011
I note in the news that here in the US, a Republican Senator has criticized Republican House members by calling them naïve “hobbits”, and a Republican Representative has riposted by saying he’d rather be a hobbit than a troll. This is just the latest instance of Tolkien’s Lord of the Rings being referenced by US politicians, especially Republicans.
Enough, enough of this. I may be alone in saying this, but I suspect that Tolkien would be sick at heart at what his message has become.
It is clear from his published Letters that Tolkien distrusted Americans – which, at that time (the 1950s and early 1960s), meant American men. He was, for example, very wary of having his books made into movies by Americans; his stated reason was that the books’ words were meant to be mythical, and didn’t sound right when spoken. As it turns out, the makers of the movie found that in many cases the words of the book sounded better than a modern rewrite, so in this case he was unnecessarily modest. Also, the unauthorized Ace Books paperback, while popularizing the book in America, infringed his copyright, so that at first he was deprived of badly needed royalties. But there was more than that at work; what else caused him to distrust American men is not clear.
However, I find increasing evidence since that time that he was right to do so. I first read the books in the early 1960s, so I have watched the phenomenon unfold since the beginning. The divergence began then, I think.
One notable feature of The Lord of the Rings is that it has appealed to an unusual number of women – not only men. This was unusual for its time: fantasy in America then was focused on Robert E. Howard’s over-muscled superheroes, with the occasional female jock who somehow lacked basic protective armor except in strategic but minimal places – not the kind of thing that many women then or now find interesting, except as an obvious example of men’s incomprehensible, irritating taste. Moreover, as female norms have changed over here, Tolkien’s focus on male characters would normally seem dated.
The key, I think, lies in a surprising amount of “rethinking” underlying the text. Not that Tolkien was a proto-feminist. However, there is a very interesting story called “Aldarion and Erendis”, set in the same world and written at about the same time, that clearly enters into his thinking. In it, he sets out at great length, and with some sympathy, the objections of the wife whom the hero-king has left behind on his quests. Moreover, in her own way Erendis sets up a counter-kingdom excluding men, showing clear leadership qualities. We see traces of this not in Arwen, as the movie would have it, but in Eowyn, who is able to make the switch from killing things to being an equal ruler who grows things.
This, in turn, allows women reading it for the first time to feel connected to the story, and then to appreciate Tolkien’s focus on relationships and beauty. It was noteworthy that when the movie came out, the one scholar who focused on Tolkien’s vision of a nature so alive that it glowed with an inner light was also the one woman scholar quoted.
However, right from the start, many American men have read these same sections with impatience, even scorn at Tolkien’s style. Early on, Gary Gygax, creator of the Dungeons and Dragons game that was in many ways a straightforward elaboration of Tolkien’s cast of creatures, was quoted as saying that he could write better, because he would go straight to the action. Peter Jackson (yes, I know he’s from New Zealand, but he was clearly tailoring the movie to American tastes) did his very best to keep the sense of dread and action constant, and dropped much of the first “book” in consequence. Moreover, his experience in horror movies tailored to American tastes led him, I think, to the understandable decision to give a “horror” or “action” tinge to all the scenes, especially the fight scenes. The movie, in turn, reinforced in a new generation of American men who may never have read the book the belief that it’s about plucky little and big superheroes who go out and save the world from evil by winning wars.
This is so far from Tolkien’s apparent thinking that it’s hard to know where to begin. His experience in the trenches of WW I left him with what today might be described as Post Traumatic Stress Disorder, and he was reluctant to have his son serve in WW II, because he felt the war was being used by the government to demonize the enemy. Lord of the Rings reflects that thinking: war is a horrible experience (“It was Sam’s first view of a battle of Men against Men, and he did not like it much”), and a last resort, and those who wage it must be very clear-eyed about why it is necessary and how to minimize its effects. The Lord of Gondor, who believes the struggle is all about Gondor and that only he can save it, is just as destructive as the counselor in Rohan who betrays the kingdom in order to “get” the woman he lusts after.
The second key point is that the story is about people changing – or not. Above all, it’s a story about Frodo changing. Perhaps the sentence most charged with meaning in the entire story is this: “I tried to save the Shire, and it has been saved, but not for me.” The entire first book is filled with meetings in which Frodo’s (and our) vision of the world is stretched, and stretched again, until he understands much of what is valuable in the world from its beginning until now. Then – and this point cannot be emphasized enough – he sets out to help save that world, and he fails. The result of that failure – the failure to resist the Ring enough to cast it into the Fire – and of his wounds, the equivalent of PTSD, is that he can’t stop missing the Ring, and he can’t enjoy the world that is saved, especially because many of the things he wanted to save are vanishing anyway. I repeat: this is not a story about superheroes winning wars; it is about people enlarging their vision so that they understand what’s at stake, instead of thinking it’s all about them and that only they know the right answer. “Even Sauron was not evil in the beginning”; thinking it was all about him was the key mistake Sauron made.
The third key point is that in Tolkien’s world, not only are the people connected parts of an integral whole, but so is nature. The weather, the trees, all affect people’s moods and thinking, and are in turn fostered or destroyed by people. This is no abstract “the hero and his buddies go on a quest” folktale. It is a story in which your success is connected to the success of Brand and Dain halfway around the world, and your failure at Weathertop in the north is connected to the fortunes of the Corsairs at Pelargir in the uttermost south. It is a story in which you would not succeed unless you helped save the trees of Fangorn Forest, and unless the wind from the South dissipated the storm of Mordor. If you lose that connection, the world is defiled, “diseased beyond all healing – unless the Great Sea should enter in and wash it with oblivion.”
OK, so back to my quotes. It’s about plucky hobbits and evil trolls and fighting the enemy – yes, that sounds like an American man; and it is the opposite of Tolkien. It’s time to cut down the evil government, we know how to do it, and we don’t believe the alarmists who say the cure is worse than the disease – yes, that sounds like an American man; and it is the opposite of Tolkien. Let’s cut back on environmental regulation to save money, and drill more oil for our energy needs, as the rich suggest; well, let’s see what Tolkien says: “He started shipping food down south. People didn’t like it, what with winter coming on … But lately, he’s been pouring filth out of that mill for no reason, fouling the water …” When there is a clear connection between present increases in fossil-fuel use and future injury to most people’s habitat, Tolkien would very clearly vote for better environmental protection and less drilling. Whether you think he’s right or not, he very likely would regard the US House efforts in this regard as clearly evil.
There’s a wonderful quote I have never been able to track down, that runs something like this: “When a man first commits murder, it is to be expected that he will then pass to assault and battery; he may indeed go on to wanton destruction of personal property, and in some cases from thence to defacement of public property; indeed, as hard as it is to believe, even violation of the Sabbath may not be beyond his reach.” I realize that, in the scale of things, compared to risking the full faith and credit of the US government and threatening the global economy, trashing JRR Tolkien’s deep personal beliefs may not be of the same measure. But still, you American men, can you not at least spare those few of us who appreciate the full measure of his work the spectacle of you trampling on his grave? At long last, sirs, have you no shame? No shame at all?
Hello?
Friday, July 29, 2011
OMG This Math Is Depressing
I just had the dubious pleasure of reading a transcript of an interview related to the “Arctic scientist muzzled for paper about polar bears.” It is incredibly depressing because of the ignorance of absolutely basic math it reveals – not on the part of the scientist, who does just fine.
Here is my summary of the interview. Let’s see if you can do better than the interviewers. I have even simplified the numbers ever so slightly.
Two Interviewers: Hi. We’re here from the investigative branch of the department to investigate allegations of scientific misconduct in a paper you wrote about a sudden apparent increase in deaths of polar bears.
Scientist: Do you have the scientific background to understand the paper?
Interviewers: No.
Scientist: OK, I’ll do my best.
We had been doing surveys of whales up here off the Alaska coast for 20 years, noting all other creatures out there as well. Each sweep covers (randomly) 10% of the total area we watch over. One year, for the first time, the ice moved well away from the land. On our next sweep that year, we saw four polar bears swimming. On the sweep after that, we saw three dead polar bears. Now, we couldn’t ever remember seeing such a thing, so I checked my notes, and the memory and notes of the guy who had been doing this before me since the beginning, and we’d never seen such a thing. So we wrote up a paper about it, passed it by everyone at the agency, had it anonymously peer reviewed by three people, and it was published in Polar Biology.
Interviewers: OK, so what you’re saying is, you saw 7 polar bears. How can you say there were 30 dead polar bears out there?
Scientist: What?! I didn’t say there were 7 dead polar bears – I said there were three.
Interviewers: No, in your paper, you say 4 polar bears on one sweep, and 3 in the next. Four plus three equals seven.
Scientist: But …
Interviewers: Also, why didn’t you say 7 dead polar bears instead of 30, since that’s all you saw?
Scientist: Look, in the first place I only swept 10% of the area, so I multiplied the number I saw by 10 …
Interviewers: Why would you multiply by 10?
Scientist: Excuse me, but have you ever taken any fifth grade math?
Interviewers: And even if it was OK to multiply by 10, that would mean you were claiming you saw 70 dead polar bears.
Scientist: No, I’m claiming I saw 3 dead polar bears, and that the best guess for the total in my area was 30 dead polar bears.
Interviewers: Ah, so let me read back to you what you have said. ‘I am claiming in my paper that it is likely that there are 30 dead polar bears out there.’ Is that correct?
Scientist: No. Have you ever taken any statistics? Even just a little? It is not “likely” that there are 30 dead polar bears out there. It’s just the most likely number, and there is an almost 50% chance of a number less than that, and an almost 50% chance of a number grea - …
Interviewers: Well, I think we have all we need.
Scientist: In that case, on the record, let me tell you what’s really going on. First, the purpose of the paper was not to establish a final determination of what was going on, but to say that something odd was going on. Second, my hypothesis – that there are increased polar bear deaths because of ice withdrawal from land – has been amply proven since by scientific research, anonymously peer reviewed. Third, this department has persistently attempted to prevent me and others from publishing any research that might support global warming, even though this research is a clear part of my job as a scientist and a clear part of my task in this department. Fourth, I am supposed to be checking out anything that might affect the natives here, not just whales. They want to know about my research, whales and otherwise. The only people who don’t are the oil companies whose permits might be affected and the political appointees in this department who seem to be doing their bidding. Instead of investigating me, why don’t you investigate them for “scientific misconduct”? I can certainly document attempts to distort my research, to the point where I took my name off the product …
Interviewers: Goodbye.
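For anyone who wants to check the scientist’s statistics, here is a minimal sketch in Python. The interview’s numbers are used as given; the flat prior over the true total, and the assumption that each dead bear independently had a 10% chance of lying in the swept strip, are mine, for illustration. Under those assumptions the below/above split comes out nearer one-third/two-thirds than the scientist’s rounded “almost 50%”, but his central point holds exactly: the “most likely” total of 30 is itself quite improbable, with most of the probability lying on either side of it.

```python
from math import comb

seen = 3          # dead bears actually observed on one sweep
coverage = 0.10   # each sweep covers a random 10% of the area

# The scientist's multiply-by-10 step: scale the sample up to the whole area.
print(seen / coverage)  # 30.0 -- the single best guess

# "Most likely" is not "likely". Weight each candidate true total N by the
# chance that a random 10% sweep over N dead bears turns up exactly 3:
def weight(n):
    return comb(n, seen) * coverage**seen * (1 - coverage)**(n - seen)

ns = range(seen, 400)  # 400 is far enough out that the remaining tail is negligible
total = sum(weight(n) for n in ns)
print(weight(30) / total)                            # ~0.02: exactly 30 is itself unlikely
print(sum(weight(n) for n in ns if n < 30) / total)  # ~0.35 below the best guess
print(sum(weight(n) for n in ns if n > 30) / total)  # ~0.62 above it
```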
Tuesday, July 26, 2011
Don't Cloud the Computing Carbon Emissions Reduction Issue
I recently read a post by Jon Koomey, Consulting Professor at Stanford, at www.climateprogress.org, called “4 reasons why cloud computing is efficient”. He argues (along with some other folks) that cloud computing – by which he apparently means almost entirely public clouds – is much more beneficial for reducing computing’s carbon emissions than the real-world alternatives. As a computer industry analyst greatly concerned by carbon emissions, I'd like to agree with Jon; I really would. However, I feel that his analysis omits several factors of great importance that lead to a different conclusion.
The study he cites compares the public cloud -- not a private or hybrid cloud -- to "the equivalent". It is clear from context that it is talking about a "scale-out" solution of hundreds or thousands of small servers, each with a few processors. This is, indeed, typical of most public clouds, and other studies have shown that in isolation, these servers have a utilization rate of perhaps 10-20%. However, the scale-up, hundreds-of-processors servers that are a clear alternative, and which are typically not used in public clouds (but are often used in private clouds), have a far better record. The most recent mainframe implementations, which support up to a thousand "virtual machines", achieve utilization rates of better than 90% -- three times better carbon efficiency than the public cloud, right up front.
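To make the utilization arithmetic concrete, here is a back-of-the-envelope sketch in Python. Every power and capacity figure in it is hypothetical, chosen only to illustrate how a 90%-utilized scale-up box can come out roughly three times more carbon-efficient even if it is assumed to draw twice the power per unit of capacity.

```python
# Hypothetical figures throughout -- the point is the arithmetic, not the numbers.
def carbon_per_work(watts, capacity_units, utilization, grid_kg_co2_per_kwh):
    """kg of CO2 emitted per unit-hour of useful work actually delivered."""
    useful_units = capacity_units * utilization
    return (watts / 1000.0) * grid_kg_co2_per_kwh / useful_units

# Scale-out box: 300 W for 4 units of capacity, 15% utilized (assumed).
scale_out = carbon_per_work(300, 4, 0.15, 0.5)

# Scale-up box: assumed to draw twice the power per unit of capacity
# (24 kW for 160 units), but 90% utilized.
scale_up = carbon_per_work(24_000, 160, 0.90, 0.5)

print(scale_out / scale_up)  # 3.0 under these assumptions
```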
The second factor Jon omits is the location of the public cloud. According to Carol Baroudi, author of "Green IT For Dummies", only one public cloud site that she studied is located in an area with a strong record of carbon-emission-light electricity (Oregon). The others are in areas where the energy is "cheaper" because of fossil fuel use. That may change; but you don't move a public cloud data center easily, because the petabytes of data stored there to deliver high performance to nearby customers don't move easily, even over short distances. Corporate data centers are more movable, because their data storage sizes are smaller and they have extensive experience with "consolidation". While until recently most organizations were not conscious of the carbon-emission effects of their location, it appears that companies like IBM are indeed more conscious of this concern than most public cloud providers.
The third factor that Jon omits is what I call "flight to the dirty". The high up-front costs of more efficient scale-up servers lead, unconsciously, to use of less energy-efficient scale-out servers. Controls over access to public and private clouds and data centers, and visibility of their costs, move consumer and local computing onto PCs and smartphones. The apparent cheapness of labor and office space in developing nations leads companies to rapidly implement data centers and computing there using existing energy-inefficient, carbon-wasting electrical supplies. None of these "carbon inefficiencies" is captured in typical analyses.
Personally, I come to three different conclusions:
1. The most carbon-efficient computing providers use scale-up computing and integrated energy management, and so far most if not all of those are private clouds.
2. The IT shops that are most effective at improving carbon efficiency in computing monitor energy efficiency and carbon emissions not only inside but also outside the data center, and those inevitably are not public clouds.
3. Public clouds, up to now, appear to be "throwing good money after bad" in investing in locations that will be slower to provide carbon-emission-light electricity -- so that public clouds may indeed slow the movement towards more carbon-efficient IT.
A better way of moving computing as a whole towards carbon-emission reductions is by embedding carbon monitoring and costing throughout the financials and computers of companies. Already, a few visionary companies are doing just that. Public cloud companies should get on this bandwagon, by making their share of carbon emissions transparent to these companies (and by doing such monitoring and costing themselves). This should lead both parties to the conclusion that they should either relocate their data centers or develop their own solar/wind energy sources, that they should move towards scale-up servers and integrated energy management, and that they should not move to less costly countries without achieving energy efficiency and carbon-emission reduction for their sites up front.
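As a concrete illustration of that embedding, here is a minimal sketch in Python; the workload names and grid-intensity figures are hypothetical, and real carbon accounting would draw on metered data. It shows the core of the idea: charge each workload for its energy times the carbon intensity of the grid where it ran, so that location and efficiency both show up in the books.

```python
# Hypothetical per-workload carbon costing; names and figures are illustrative.
from dataclasses import dataclass

@dataclass
class WorkloadRun:
    name: str
    kwh: float                  # metered energy consumed by the run
    grid_kg_co2_per_kwh: float  # carbon intensity where the data center sits

    def kg_co2(self) -> float:
        return self.kwh * self.grid_kg_co2_per_kwh

runs = [
    WorkloadRun("nightly-etl", kwh=120.0, grid_kg_co2_per_kwh=0.75),  # coal-heavy grid
    WorkloadRun("nightly-etl", kwh=120.0, grid_kg_co2_per_kwh=0.12),  # hydro-heavy grid
]
for r in runs:
    print(f"{r.name}: {r.kg_co2():.1f} kg CO2")  # same work, roughly 6x the emissions
```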
Tuesday, July 19, 2011
Will We All Speak IT?
At a recent teleconference, I heard the speaker first refer to “provisioning” a solution and then to people who would “on-board” that solution. It suddenly struck me that I was witnessing a new stage in the intrusion of language derived from computing into our daily lives.
Here’s how it used to go: we grew up with the rules of grammar and vocabulary as taught us in school, and as computer technology evolved, its new ideas and products used the words of, and fit neatly into, the English we were taught. A machine made a computation of a number, it computed, it was a computer. A piece of information in a computer, from the Latin, was a datum, plural data, stored in a data base, managed by a database management system.
In the same way, the jargon of computer techies, even when it spilled over into the population at large, was ultimately derived from ideas already in English. Bogosity – the quality of being bogus; bogon – a unit of bogosity. Misfeature – a combination of mistake and feature, a mistake that was touted by marketdroids (mindless marketers) as a feature.
ITSpeak 2.0
I first noticed things beginning to change in the late 1990s. In the 1980s, I had been frustrated as a purist by the universal tendency of my fellow programmers to refer to an example problem, an example case, an example screen, instead of a sample problem, case, or screen, as I had always been taught. Still, until the late 1990s, I never saw anyone else use “example” as an adjective; then marketing, and sometimes business blogs for a general audience, started to use “example” that way. However, even in the computing industry, there was strong purist resistance. I well remember the difficulties I had at Aberdeen Group persuading the editors that in computing, it was now “lifecycle”, not “life cycle”. Today, I can’t remember having seen “life cycle” in years.
In some ways, these tinkerings with basic English had a positive effect, I believe. Using “example” for “sample” is a good case in point: the meaning is clear from context, and it’s easier to use one word for the concept than learn two.
But the changes were not all for the good. I still remember some annoying marketer at Sybase, iirc, deciding in the late 1990s that from then on it was to be “database”, not “database management system”. The result is that users ever since have been constantly confused as to whether they are talking about the software or the data stored for use by that software – which I now always have to call the “data store” to make myself clear. In the same way, Enterprise Information Integration is now “data virtualization”, which captures only half the qualities of the software.
And, of course, with the advent of the Web IT words became far more ubiquitous, from blog to tweet. Sadly, these words have now become a measure of age, as each successive fad embeds its IT words into popular language, and we now divide generations into those that know what “to friend” means, and those who don’t.
ITSpeak Takes Over?
Even so, I didn’t see until now any clear indication that computer jargon was crowding out basic English words. But consider “provision”. Until very recently, a male was a “provider” who made enough money to put “provisions” on the table for the family. Now, IT has taken the word and abstracted it, to describe a general process of populating the empty shell of any new solution, and turned it from a noun into a verb. This major change in meaning is coming from IT, but it isn’t stopping there. Pretty soon, I expect to hear supermarkets start talking about “provisioning” their new stores, and then home builders and buyers start to talk about “provisioning” the new house with furniture.
The same goes for “on-board”. Like “friend” and “provision”, it’s a straightforward conversion of another part of speech to a verb. Like “provision”, it is a major switch in meaning that carries with it the notion of a process rather than an individual act. In the teleconference, it appeared to mean users carrying out the tasks of becoming part of a new IT solution themselves. But, again, I expect that soon employees will be expected to “on-board” themselves via “self-service portals”, and then students starting at college, and then what? Will we create new automated birthing centers where newborns will be expected to “on-board” themselves by responding to automated nipples? Will end-of-life hospices be referred to as “off-boarding centers?”
What If ITSpeak Does Take Over?
If we all start talking ITSpeak – a language many of whose concepts originated in computing – is that good or bad? I believe that it’s way too early to tell. On the positive side, many of these words come from trying to distinguish more clearly between similar things, when the differences matter. The idea that a misfeature is not the same as a feature is important, and useful to us.
On the negative side, some historical richness of meaning may be lost. Always employing “utilize” instead of “use” (not really ITSpeak, but analogous) is not only unnecessarily lengthy, it also misses the importance in history of the distinction between “applying an object for a use for which it is designed” and “applying an object whether it helps in a task or not”. You utilize a Phillips screwdriver in following the directions for assembling a kid’s toy; you use a user’s manual for Microsoft Word even though it often doesn’t give you the answer you need.
No, my point here is that I think this represents a fundamental shift in our thinking, as we begin to see the world as IT folks do. At the least, this might mean that we think more of software-type abstractions and less of “legacy” physical objects, see life more in terms of processes and less in terms of interactions, and view others less in terms of irrationality and psychology and more in terms of categories and connections. So to maximize the chances of something good coming out of this, I think we ought to at least recognize that it is going on.
Will we all speak IT, all the time? Someday, quite possibly. Right now, it’s time to prepare to provision, so that we may on-board effectively.
Friday, July 8, 2011
The IBM Acquisition Game
Recently, I was contacted by a firm called Software Advice, which has a very interesting business model: pay-for-results advice on short lists for IT buying. They just posted a blog on “IBM M&A: Who’s Next”, and were interested in my thoughts. I took a look, and found it quite impressive; and therefore, in accordance with my philosophy of comforting the afflicted and afflicting the comfortable, I decided to pick nits about their conclusions. I believe that both their and my thoughts offer some potentially useful insights to IT buyers, not just about IBM, but about how quickly vendors are likely to deliver what users need in the next 1-2 years.
The reason it’s not only fun but instructive to play the IBM acquisition game is that it implicitly asks: given user needs over the next 1-2 years, what are the holes in IBM’s lineup to meet those needs that it should fill immediately? That also allows us to ask: if IBM doesn’t fill those needs, will it come back to bite the company, because someone else is likely to beat it to the punch? And then we can ask: will folks really want to use someone besides IBM if IBM doesn’t supply this need – or is this something for which the IT buyer will have to “roll his or her own” at greater expense?
So let the game begin!
Historical Nits
Software Advice begins with a graphic nicely capturing the extent of IBM’s acquisitions over the last decade or so. The problem lies in the headings that split the acquisitions into “applications”, “infrastructure”, and “services”. You see, IBM has been firm in disclaiming any intention of getting into “applications”, and so most if not all of the acquisitions classified as “applications” are in fact what is usually called “infrastructure software.” That also means that almost identical infrastructure software is in one case classified as an application and in another as infrastructure – for example, the Rational software development toolset is counted as infrastructure software, but the Telelogic requirements management toolset, which is almost always used as the first step in the development process as part of a “lifecycle” software development toolset, is classified as an “application”. Hence it’s very easy to assume, wrongly, that IBM doesn’t need any applications acquisitions.
The interesting thing about this nit is that it raises the question: should IBM, at long last, go into the “apps business”, either on the business or consumer side? Yes, they’ve never needed to before, since until recently both Oracle and SAP (the dominant players in enterprise apps) have shown themselves willing to support all hardware vendors, but now that Oracle owns Sun and has shown it can play hardball with respect to HP Itanium, should IBM rethink that posture? Does the market now need a platform that it can be sure its enterprise or other business-critical applications will support?
The answer to that, I believe, depends on SAP. In other words, whatever the merits of other app vendors like Salesforce.com, the run-the-business applications of SAP are presently the main alternatives to Oracle Apps. If SAP remains a strong alternative, then IBM is entirely correct in continuing to keep its hands off enterprise application companies, reinforcing its image as less prone to vendor lock-in than Microsoft or Oracle.
And yet, I have to say, whether SAP will be a strong alternative remains an open question. SAP has made some major acquisitions of its own, like Business Objects and Sybase, which have taken it down the software stack with some quality infrastructure software. However, it is not yet clear that SAP can drive rapidly-changing database technology ahead fast enough to provide a long-run all-in-one enterprise-app or analytics alternative to Oracle Apps. The signs are very good: SAP appears to understand the importance of Sybase, and the potential of integrating its technologies with SAP’s present stack. Still, SAP has to execute that strategy.
I think it would make most sense for IBM to beef up its SAP application support with a smaller acquisition or two, this time of cross-database administrative tools that specialize in Sybase. Later, of course, if things get bad, IBM could always acquire SAP itself. In the meanwhile, the IT buyer should note that IBM’s and Oracle’s SAP support is a space to watch.
Strategic Investment Nits
Software Advice then goes on to identify general areas of future customer need where IBM may need to acquire companies. Their main focus – certainly a good one – is cloud administration. They also note – although with a much shorter analysis – IBM’s need to expand its analytics and BI offerings even further – and that makes sense too. Everyone, not just IBM, is scrambling to fill in the blanks and achieve fully automated hybrid-cloud deployment and administration.
However, I would disagree with their analysis of virtualization as a key area of acquisition. While VMWare continues to be an outstanding success, the pace of movement of virtualized workloads to the public cloud – the main lock-in for VMWare – remains quite slow. Private clouds in larger enterprises tend to be top-down efforts, which means that IBM is doing quite well at driving its own virtualization software across the data center. I would argue that IBM has no need to acquire either VMWare or EMC, now or in the next two years – and a good reason to wait and see what happens as Oracle competes more strongly with EMC in storage.
What might make sense, on the other hand, is for IBM to consider acquiring Red Hat. The two have been working together pretty effectively, and IBM needs to build up its open-source brand as a new market of tech-savvy open-source-oriented firms opens up. As long as it leaves the open-source culture of Red Hat in place, IBM can use Red Hat as an “early warning system” for changes in the new market – because that market cares less about VMWare vs. KVM and more about open-source-based services for cloud deployment.
My second nit regards mobile technology. It appears likely that the movement of mobile business workers towards having a laptop for some situations and a small-form-factor smartphone or tablet for others has reached flood stage, and needs to be addressed better. Sybase would have been a great entry point, but it’s not available now. Buying Apple would be fun to imagine, but seems impossible to achieve. Perhaps IBM might consider RIM. The value-add of Blackberry cell phones has always been in their business software, and while they are under threat in the consumer market, business users still find them appropriate. Here is an area of great user need where all vendors – not just IBM – fall short; so if IBM doesn’t do a good acquisition soon, IT buyers should anticipate a lot of “roll your own”.
My third nit concerns the whole area of BI/analytics. There seems to be a pervasive confusion of BI, analytics, and Big Data, as if they were the same thing. My short take on the differences: BI is basic repeated reporting and querying plus ad-hoc or goal-oriented querying, both for corporate use; analytics is ad-hoc or goal-oriented querying, not only for corporate use but also embedded in other software across the organization (e.g., security and administrative analytics); Big Data is a wide range of new large-footprint data types, usually found on the Web, that provide insights into such new marketing topics as social media, and therefore typically complement BI with extra-organizational data. The result is that any good push to meet user needs is going to need to tackle all three areas.
As I noted in a previous blog post, what users need in all three areas is some combination of scaling and user friendliness, especially for the burgeoning SMB BI market. It’s hard to buy or create user friendliness – the BI market still has a ways to go in this area. However, there are ways that IBM could improve its scalability. For one thing, Netezza and the new IBM z appliance have columnar database technology that’s too tied to a particular appliance. It’s not clear just how fast IBM will move into this area, but Amazon’s investment in ParAccel reminds us that there are still interesting columnar database suppliers out there.
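For readers who haven’t met the term, here is a toy sketch in Python of the idea behind columnar storage – the general technique, not Netezza’s or anyone else’s actual implementation. Analytics queries typically aggregate one or two fields across millions of rows, and a per-column layout lets the scan touch only those fields.

```python
from array import array

N = 100_000

# Row layout: every record carries every field, so summing one field
# still drags whole tuples through memory.
rows = [(i, f"cust{i % 1000}", i % 100, (i % 97) * 1.5) for i in range(N)]
row_sum = sum(r[3] for r in rows)

# Columnar layout: the same field stored as one contiguous array of doubles;
# the aggregate scans just those values (and such arrays compress far better).
amounts = array("d", ((i % 97) * 1.5 for i in range(N)))
col_sum = sum(amounts)

assert row_sum == col_sum  # same answer, far fewer bytes touched
```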
On the Big Data side, users must also consider integrating BI with file-system-stored Web data such as that accessed via Hadoop. There are quite a few NoSQL open-source efforts that may be worth productizing and integrating with DB2 or a columnar database. Again, this is an area where all vendors – not just IBM – need to do more to make the path to combined BI/analytics/Big Data clear. In the meanwhile, IT buyers should think carefully about buying from only one database vendor, because until one of them shows they have the full Big Data story there is no guarantee that any of them will not fall short of what users need – and past experience suggests that database lock-in is about as locked in as you can get.
Endgame
Having played Software Advice’s IBM Acquisition Game, I draw three conclusions from it. First, IBM is in a surprisingly strong position going forward. There is no obvious hole in its solution lineup that immediately threatens the company, or that it cannot fix by careful re-tuning of the strategies it has followed up to now. And that’s good news for IT buyers.
Which leads me to conclusion two: there’s still enough choice in the market. We have seen a lot of acquisitions, not just from IBM but from other major vendors, in the last decade; but the fact that there are still smaller companies out there to plug holes for IBM and others means that IT buyers can still find a way to stitch together a solution where one’s favored vendor doesn’t quite cover all needs.
And that leads to conclusion three: despite the hype, users are still a long way from taking full advantage of mobile, cloud, or analytics/Big Data. This may well be a transition as slow and incomplete as the one in the early 2000s to service-oriented architectures – and don’t get me started about Business Process Integration. In fact, it might be a good idea to play the Acquisition Game with other vendors on your short lists – and then see what those vendors do in the real world to cover the holes you find, before committing irrevocably and totally to one of them. That’s not to say you shouldn’t press ahead with all deliberate speed, as your competitors will be doing -- but cover your bets.