Wednesday, March 11, 2015

Climate Change: That Is All

In the movie M*A*S*H, someone would periodically announce upsetting news like “The US announced today that it had tested a nuclear device with one thousand times the explosive power of the bomb at Hiroshima” in a matter-of-fact voice, and would end the announcement with a brief “That is all.”

A recent study has now projected that under the “best-case scenario” (now changed to have atmospheric carbon level off at 525 ppm rather than just above 400 ppm), temperatures over the middle of the 21st century will increase by 0.5 degree F per decade, and in the Arctic by 1 degree F per decade.  In the worst-case scenario, temperatures will increase by 1 degree F per decade, and in the Arctic by 2 degrees F per decade.  None of these scenarios takes into account melting permafrost, which is likely to increase atmospheric carbon, and hence temperatures, yet further.

In the Arctic, after 2 years in which sea ice minima reverted to a little above the level of 2007, it now appears very likely that the yearly maximum will be well below past recorded maxima, and will be the first recorded maximum at less than 14 million km².  Meanwhile, a weak El Niño has been announced, and in the past El Niños have been associated with accelerated sea ice melting over the summer.  Another recent study has shown that on average 65% of Arctic sea ice has vanished since 1975, and 85% at the sea ice minimum in September.  Yet another study has found that the so-called “hiatus” or “slowdown” in global warming over the last 15-20 years does not exist once Arctic data are added.  It is anticipated that over the next few decades, global warming will speed up instead of rising at a constant rate, and therefore warming south of the Arctic will likewise speed up.

Another study has shown that melting of Greenland land ice is proceeding faster than previously thought, and therefore its contribution to sea level rise in the next few decades is going to be greater than anticipated in previous studies (which projected a 6-foot rise by the end of the century).  In the Antarctic, likewise, studies show the present rate of land ice melt has been underestimated, and therefore that Antarctic melting will also contribute significantly to sea level rise.

Finally, atmospheric carbon measurements for February are now out.  For the first time since records began to be kept in the 1950s, February atmospheric carbon is above 400 ppm, and approximately 2.5 ppm above last year.

That is all.

Wednesday, March 4, 2015

Memories of Bad Development Experiences

Recently, Charles King of Pund-IT Review was kind enough to send me a link to an old blog post talking about a type of bad software development experience that the writer had undergone – one that the writer christened “A*h*-Driven Development”, or ADD.  The comments on that blog post added some other wonderful designations of bad development experiences, complete with acronyms, many of which I enjoyed as well – until it struck me that in my 12-year developer career I had not experienced any of them.
And so, Charles has inspired me to set down my own experiences of misbegotten, crippled-from-the-start development, with due attention to appropriate acronyms.  And if you don’t like it, remember, don’t blame me for this post – blame Charles.

Commentless Reverse-Engineered Ugly Development (CRUD)

In my time, I ran into two extraordinary examples of this phenomenon.  The first was the case of a talented but strange person who had single-handedly created the first version of the software for an intelligent terminal.  This was in the days when, in order to develop software at certain firms, one had to write out the programs (in Intel 8080 assembler language, of course) and then submit them to typists, who would type in the written program, run the compiler and linker, and pass the results back to the programmer, until the program worked. 
However, the person who created the first version did not believe in comments, so it was very difficult to determine how the program worked.  What’s more, that person had a tendency to label statements that were to be branched to with obscenities, so that the typists had reached the point of refusing to type in his code.  Above all, the person was surly to the point that asking him to do a far larger second version in a timely fashion was risky, if not impossible.  However, it was also necessary that the second version provide the functions that the first version did, to ensure that customers went for the upgrade.
Eventually, management bit the bullet and hired more developers to do v2.  However, because of the lack of comments, the functions of v2 had to be reverse-engineered:  these developers had to guess just how the original developer had created those functions in v1.  In the end, the team had to spend a long while creating a massive initial specification that included “best guesses” as to how v1’s functions worked, and the usual discoveries of problems, combined with the new problems of coordinating multiple programmers, made the development process very long.  Eventually, v2 got done; but by that time, the lead in the market that v1 had given had pretty much dissipated, as I understand it.
And that talented but strange developer?  Upper management found a solution that to my mind smacks of genius:  they promoted him to be a software development manager.  No more obscenities, no more surliness; he had to get what he wanted done through other programmers, and in order to do it he even improved his physical appearance so he wasn’t so unsanitary.  In fact, after his transformation he was by no means the worst personality problem among the developers – that would be someone we called the Mad Russian.  But that’s a story for another time.
On to story number two.  This was in later times, when one had one’s own IBM terminal on which to type in, compile, and link programs, written not in assembler but in what used to be called a 4GL – a language a step up from C or PL/I, and vaguely English-like in its commands.  This particular company had a “wonder installer” who knew the product inside and out and knew just how to make the latest version run superbly when he installed it at customer sites.  In effect, he was so good that he seemed to sell the customer all by himself. 
And then the next version of the product would come out, and if he did not install it, screams of anguish would arrive from existing customers about all sorts of vital programs that didn’t work – this despite careful efforts at backward-compatibility testing by some extremely capable programmers.  What on earth could be going on?
As it turned out, when this installer would show up at a customer site, the customer would in the course of installation say, “gee, wouldn’t it be nice if …” and the installer would say “Of course I can do that for you”, and in a flash would whip up a program based on the product that would do just what the customer wanted.  There was one problem:  that little add-on was based on the innards of version x, and when version x + 1 arrived, those innards had changed, and the add-on no longer worked.  Of course, testing did not catch the problem, because users were only supposed to use the interfaces, not the innards beneath them.  And, of course (since a 4GL was English-like, the installer assumed no commenting was needed), there were no comments explaining what the program was doing, and the installer was out doing another job, and there was a key customer screaming bloody murder.
In the end, as in case 1, some of these programs had to be “reverse engineered”.  Luckily, the programs were generally small, so it was less difficult to accomplish this.  Still, it continually slowed down development of new functionality in the product, because these programs that only one or two key customers wanted were preempting new functionality that most customers needed.
I don’t know for sure how the company finally solved this problem.  I think that they finally pulled back the installer from the field and redirected him to new-function design, where it was impossible for any of his little per-company customizations to make it into running product.  In the end, as in story 1, a good but clueless programmer created CRUD, and the company that had to clean up the mess decided it never wanted CRUD again.
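Decades later, this failure mode still recurs whenever extensions reach past a product’s public interface into its internals.  Here is a minimal sketch (in Python, with entirely hypothetical names – the actual product was a 4GL, not Python) of why careful backward-compatibility testing never caught the breakage:

```python
# Sketch of the installer's problem: an add-on that pokes at a product's
# internals works fine in version 1, survives no compatibility test failure
# (tests only exercise the public interface), yet breaks in version 2 when
# the internals are reorganized.  All names here are hypothetical.

class ReportEngine:
    """Version 1 of the (hypothetical) product."""
    def __init__(self):
        self._rows = []          # internal storage: a plain list in v1

    def add_record(self, name, value):
        self._rows.append((name, value))

    def total(self):             # the supported, public interface
        return sum(value for _, value in self._rows)


def customer_addon(engine):
    """A quick on-site customization that reaches into the innards."""
    return [name.upper() for name, _ in engine._rows]   # relies on _rows!


class ReportEngineV2(ReportEngine):
    """Version 2: same public API, but internals reorganized into a dict."""
    def __init__(self):
        self._index = {}         # _rows is gone, silently

    def add_record(self, name, value):
        self._index[name] = self._index.get(name, 0) + value

    def total(self):
        return sum(self._index.values())


v1 = ReportEngine()
v1.add_record("widgets", 3)
assert v1.total() == 3
assert customer_addon(v1) == ["WIDGETS"]   # works against v1's internals

v2 = ReportEngineV2()
v2.add_record("widgets", 3)
assert v2.total() == 3                     # public API is backward compatible
try:
    customer_addon(v2)                     # but the add-on breaks
except AttributeError:
    print("add-on broke on upgrade")
```

The public `total()` interface passes every compatibility check across versions, yet the add-on dies the moment the internals it quietly depended on are reorganized – exactly the screaming-customer scenario above.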

Design-By-Meeting Endless Asinine Development (DEAD)

I don’t know what your reaction to this story will be, gentle blog reader (OK, just a little snark), but even 30 years later I cannot believe it actually took place.  Here goes.
Again, this involves a software company using a 4GL (yes, the same company), and this time I should note that one of its products was a database.  At the time of which I write, the company decided that the accompanying data dictionary needed a major refresh, and the person who had been upgrading the product all along moved on to other things, while a mostly-new development team was brought in to create that plus a new development tool for the 4GL – and one more person, crossing over from sales and assuming the position of overall development manager.  She apparently had no development or development-management experience, only some first-line sales managerial experience.
At this point, I think I should make one thing clear.  As I sometimes put it, “The worst development manager I had was a woman; and the best development manager I ever had was a woman.”  Someday I hope to write about why I feel that Leslie Turek, then of Computer Corp. of America, was and is my beau ideal of a development manager, even better than several superb managers I had during my development career.  Even though, to me, the manager mentioned in this story is the worst development manager I had, I don’t believe that most of the problems were her fault.  She was, simply, wrongly put by the company into a situation in which, because of her lack of previous experience, it was practically impossible for her to avoid managing development poorly.
Anyway, so here you are, tasked with managing multiple developers, with no experience or software programming knowledge as a guide, and with your first task being to create a design spec for the new product in seven weeks.  What do you do?  Well, this manager’s idea was to sit the programmers down in a meeting and have them discuss it, then reflect that discussion in an agreed-on design spec.  As part of the team, although I was in parallel creating a design spec for the development tool, I was expected to attend and contribute.
Sounds plausible, yes?  Except that what it turned out to mean was seven consecutive weeks of meetings from 9 to 4:30, Monday through Friday, with a half-hour break for lunch.  And don’t forget that this was our manager running the meetings.  Any sign of deviation from the consensus as she saw it was delaying the completion of the spec and a black mark against you.  Any undue sign of impatience or inattention due to the exhausting nature of sitting there concentrating on very fuzzy stuff for hours was a black mark against you.  And those who, like me, sometimes had limits on how late into the night we could work writing down the day’s production into an evolving design spec (in my case, because I had to get home to help with an autistic son) got black marks.
Here’s an example of how the process worked – or rather, failed to work.  At the point where we had to decide on the data dictionary’s user interface, the manager happened across someone who had begun to do academic-style research on user interfaces, and had discovered that in menus, if there were more than seven choices, users would forget about some of the choices.  This immediately got translated in the manager’s mind into “we should use menus” as well as “menus should have no more than seven choices.”  We then spent two hours generating off the top of our heads menu choices for a particular screen that “we really should have.”  There was no consideration of whether the choices could be implemented given the information that the data dictionary would have; there was no discussion of how readable the menu choices we generated were; as often happens in this sort of idea generation, we wound up with 10 menu choices instead of 7; and there was not even any consideration of making the choices involve action verbs or otherwise making the differences between the choices apparent to the reader.  No; what mattered was that there was a great deal of detail provided by all of these menu choices, and by sheer weight of detail this would look great in a design spec to the untutored eye.
I spent much of that two hours (and the rest of the meetings) screaming in silent designer agony.  I had at the same time created in my mind and noted simply on paper a design for the development tool that had a user interface with typically three choices per screen, very simply and clearly put – what I have called in other posts “orthogonal.”  I had little time to flesh out my own design spec because of these meetings.  And raising a point about anything I have listed above would require objecting at the end of the discussion – because only then would it become clear that there would be no discussion of these points.  And if I did that, then I would get a black mark for going against the consensus as the manager viewed it and delaying the spec creation process uselessly.
The rest of the story, really, is anticlimax.  At the end of the seven weeks, the manager decided that because of the lack of detail in my design spec, I should be cut loose.  I stayed on to help another programmer to complete the development tool that I had designed, a process that we completed in 2 ½ months despite the lack of “detail.”  I was told later that the development tool was widely implemented and well received by the customer base.  Meanwhile, as I was also told later, the data dictionary kept missing deadlines.  Finally, after about nine months of development, the programmers under my manager revolted and literally refused to work for her any longer.  I am not sure whether some form of the data dictionary that this design by meeting specified was ever implemented, but it was certainly not the full design.
The moral of the story, to me, is simple.  If one tries to design by meeting, and by nothing but that, in the long run (say, a year from now), the careers of all involved in the company are DEAD.


What really strikes me, looking back all these years later on these (to me) horrible development-process experiences, was how few of them there actually were.  Perhaps I’m an old softie, inclined to sugar-coat my younger life – naah.  Or perhaps those who contributed comments to the blog post Charles sent me really are experiencing a greater frequency of lousy development experiences.  Somehow I doubt that, as well.  In my early years, especially, managers were just beginning to realize that there was more to software-development management than there was to managing, say, the accounting arm.  I know there were a lot of clueless and therefore paranoid managers out there; I think I was just lucky enough to run into fewer such experiences. 
But it still puzzles me that no one out there has mentioned the kind of experiences I had.  Maybe that is a hopeful sign.  Maybe, in the words of the Latin teacher in Rudyard Kipling’s Stalky & Co., “you see, even among the barbarians – some of it sticks”.  “Amen,” said the chaplain.  “Go to bed.”