Monday, November 30, 2015

Climate Change Update: The Fundamental Things Still Apply, and Honest Cost-Benefit Analyses Are Dangerously Flawed


It has been hard to find a good reason to post an update on climate change, although superficially there is a lot of relatively good news.  Climate change is now an acceptable topic in the TV “nature” documentaries to which so many are addicted; for the first time, China has committed to reducing coal use and has delivered in a measurable fashion; there is the outline of a global plan for carbon-emission reduction as we head into the latest climate-change summit; and even the rhetoric of the Republican party has allowed for a candidate (Kasich) who admits that serious climate change is happening and that something needs to be done about it.  And then there is Pres. Obama, who is, as far as I know, the first major political figure to admit that we not only need to cut back on carbon emissions but also need to keep a large chunk of our remaining fossil fuels in the ground indefinitely.

The Fundamental Bad News Still Applies

So why do I feel that these are not significant enough to discuss in detail?  For this simple reason:  as far as we can tell, atmospheric carbon continues to rise not just in a straight line, but with a slow acceleration.  To put it another way, if atmospheric carbon simply kept increasing at its present rate of about 2.5 ppm per year, by 2100 it would reach about 625 ppm, corresponding (per Hansen) to a 4.4 degree C (7.6 degree F) rise in global temperature.  If, however, it continues to accelerate at its present rate, according to one estimate it will reach 920 ppm by 2100, baking in a whopping 7.5 degree C (almost 14 degree F) increase.  When I say “baked in”, I mean that we may not see that full temperature increase by 2100, but much of it will show up in the 30-50 years after 2100.
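For the numerically inclined, the arithmetic behind these two scenarios can be sketched in a few lines of Python.  This is a back-of-the-envelope illustration only: the roughly 400 ppm starting concentration is my assumption, and the acceleration value is simply the one that reproduces the cited ~920 ppm estimate, not a figure taken from any climate model.

# Back-of-the-envelope CO2 extrapolations to 2100 (illustration only).
# Assumed: roughly 400 ppm in 2015 and growth of about 2.5 ppm/year, as
# stated above.  The acceleration below is fitted to match the cited
# ~920 ppm estimate; the cited figures themselves come from other sources.
START_YEAR, END_YEAR = 2015, 2100
C0 = 400.0     # ppm, approximate 2015 concentration (assumed)
RATE = 2.5     # ppm/year, current growth rate
ACCEL = 0.085  # ppm/year^2, chosen to reproduce the ~920 ppm estimate

years = END_YEAR - START_YEAR
linear = C0 + RATE * years
accelerating = C0 + RATE * years + 0.5 * ACCEL * years ** 2
print(f"Constant growth:     about {linear:.0f} ppm by 2100")        # ~612 ppm; the post's ~625 reflects a slightly different start
print(f"Accelerating growth: about {accelerating:.0f} ppm by 2100")  # ~920 ppm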
Meanwhile, the climate change this year is unfortunately proceeding much as seemed likely five years ago.  The El Niño that causes temperature spikes was delayed, but as a result it is now almost certain to be the strongest on record, driving global land temperatures to an inevitable new high (last year set the previous record) of about two-thirds of a degree F above the old record.  Moreover, it now seems that the El Niño will continue for quite a few months into next year, almost guaranteeing that 2016 global land temperatures will be significantly above 2015’s.  It would not be surprising if 2015 and 2016 together added up to a full 1 degree F jump.
And finally, the “strange” rebound in Arctic sea ice volume after 2012 is clearly over, and the best prognostications suggest that the 2016 minimum volume will be the second lowest on record, after 2012 – suggesting that the at-least-linear reduction in volume over the last 40 years continues.  As a result, Greenland’s glaciers continue collapsing and should contribute several feet to sea level rise this century.  Some forecasts even contemplate a 26-foot rise from all sources (primarily Greenland and Antarctica) by 2100, although more sober analyses still suggest somewhere between 6 and 16 feet.
In other words, the fundamentals of human-caused climate change continue to apply, however much we may delude ourselves that our measures up to now have had a significant impact.  Like Alice in Through the Looking-Glass, we will probably have to run twice as fast just to get anywhere, and then four times as fast, and then ...

Honest Cost-Benefit Analysis Continues to be Flawed

To my mind, the only really good news, if good news there be, is that I am beginning to see honest cost-benefit analyses – analyses that really try to face the costs and benefits of the mitigation required to do something significant about climate change.  For example, one blog post noted that most such analyses fail to reflect people’s difficulties in moving when climate change or climate-change mitigation requires it – meaning, at least, that a few now do.  It has been absurd to watch commentators assume that the loss of 50-90% of present arable land and of the water it requires translates easily into new (but temporary) growing spots, with the costs of moving to those locations vanishing by the magic of the so-called free market.
However, an analysis published in MIT’s Technology Review identified one barrier that, in the author’s mind, doomed any near-term conversion to solar energy:  given utilities’ resistance to meshing the existing electrical grid with individual solar installations, homeowners face an installation cost large enough to make home solar uncompetitive in the near to medium term.
I don’t think the author’s cost-benefit analysis – for that is what it boiled down to – is obviously flawed.  But I do believe that it suffers from several key flaws specific to climate change:
1. It assumes a certain infrastructure (the existing grid) without considering the ways in which that grid will become comparatively more and more costly, despite temporary fixes, as changes in climate make some locations so hot that air-conditioning costs shoot through the roof, put other locations underwater or in the path of damaging storms, and leave still others with a different mix of heating and cooling needs than the grid was designed for.  A solar arrangement is not affected as much by this, because it is necessarily more distributed, can from the start provide better architecture-based, earth-derived heating and cooling, and involves less sunk cost in coal/oil/gas storage for heating.

2. It fails to handle the “disaster scenario” in which the benefits of cheaper energy today fail to outweigh the future costs of the disaster caused by everyone coming to the same don’t-change conclusion.  In other words, each decision analyzed is neither a one-off nor isolated – most if not all people will come to the same conclusion and act the same way, and that is the situation that must be modeled (the toy sketch after this list makes the point concrete).  If we all conclude that sea-level property will be fine for 40 years and can be sold thereafter to a greater fool, in a global context we quickly run out of fools, and then we can sell to nobody.

3. It assumes governments will be able to act as the backstop/insurer of last resort.  To put it another way, in the typical situation, if businesses fail, governments are expected to handle a portion of the costs of bankruptcies (or to backstop those who do, as in the case of AIG), to provide unemployment benefits so that a pool of labor remains, and to support repair of “common” infrastructure such as roads and heating/cooling when businesses can’t.  However, when the effects are close to simultaneous and truly global, most governments will be hard put to come up with the necessary support.  This, in turn, creates chaos that makes the next (and greater) crisis harder to handle.  Effectively, there is a point beyond which all countries are under such stress that, despite reallocation of business investment, economies start shrinking and the ability to handle the next stress becomes less and less.  Some estimates put that point as early as the 2060s if we continue as we are going.  So this systemic cost should be factored into the cost-benefit analysis of each individual decision that affects carbon-pollution mitigation, as well.
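To make the second flaw concrete, here is a toy calculation with wholly invented numbers – it reflects no real data and no source cited here – showing how the same “keep using cheap energy” choice can look attractive when my decision is modeled as isolated, and ruinous when everyone is assumed to reason the same way.

# Toy illustration of flaw 2, with invented numbers: the payoff of deferring
# a switch away from fossil energy, evaluated two ways.
CHEAP_ENERGY_SAVINGS = 10_000   # assumed per-household benefit of not switching
DISASTER_COST = 200_000         # assumed per-household cost if disaster hits

def disaster_probability(fraction_not_switching):
    # Assumed shape: risk rises steeply as more households defer action.
    return min(1.0, 0.03 + 0.9 * fraction_not_switching ** 2)

# (a) "Isolated" analysis: my choice alone doesn't move the needle,
# so the disaster probability is treated as fixed at today's level.
isolated_net = CHEAP_ENERGY_SAVINGS - disaster_probability(0.0) * DISASTER_COST

# (b) "Everyone reasons the same way": if deferral looks attractive to me,
# it looks attractive to nearly everyone, so evaluate at ~100% deferral.
collective_net = CHEAP_ENERGY_SAVINGS - disaster_probability(1.0) * DISASTER_COST

print(f"Net benefit, decision treated as isolated:  {isolated_net:+,.0f}")   # comes out positive
print(f"Net benefit, everyone reasons the same way: {collective_net:+,.0f}") # deeply negative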

Recommendation:  Face The Facts

In this situation, I am reminded of an episode in fantasy author Stephen Donaldson’s first series.  He posited a leper placed into a world in which an evil, powerful character seeks to turn a wonderfully healthy world into a reflection of the leper’s symptoms:  rotting, smelling, causing numbness – and a prophecy says that only the leper can save this world.  The leper befriends a “High Lord” dedicated to fighting the evil character, and says, essentially, “Look, you’re trying your best, but you’re failing.  Face facts!”  The High Lord, hearing this from the one person who can save his world, says, very carefully, “You have a great respect for facts.”  “I hate facts,” was the passionate response.  “They’re all I’ve got.”
The point, as I see it, is not to look a grim outlook in the face and give up.  It is, rather, in everything done about climate change, to understand how a particular effort falls short and how even grimmer forecasts should be factored into the next effort.  It is understanding that even as we fail to avoid the consequences of the first 4 degrees C of global warming, we must redouble our efforts to avoid the next 4 degrees.  It is cutting away such non-essentials as curbing population growth and dealing with immediate crises such as the Paris bombings, seeing that these are in part manifestations of stresses in society that global warming is exacerbating, and concluding that the primary focus should be solar now, now, now – or a reasonable equivalent.  And it is performing, individually and collectively, the kinds of honest cost-benefit analyses that will confront us with facts we might not want to face, but which will lead us as quickly as possible in the right direction.
I hate climate change facts.  They’re the closest thing to hope I have.  Season’s greetings.

Monday, November 23, 2015

IBM Acquires Weather Company IT: Into the Data Unknown


The recent IBM acquisition of much of The Weather Company (TWC) – effectively, everything but The Weather Channel – is an odd duck.  I can’t remember anything quite like it in the computer industry before now.  Indeed, I suspect that IBM does not yet fully understand where in the search for better analytics via Big Data this new addition will take it.  To put it another way, IBM is stepping into an area where the ultimate use of the acquired data is to a surprising extent unknown.

The Potential Benefits of Launching Into the Data Unknown

As I noted in a previous piece about the initial IBM-TWC partnership, some core benefits are easy to see, and the acquisition simply extends them.  Infusing analytics with weather data allows fine-tuning of customer and supplier behavior analysis and prediction – if it’s raining cats and dogs, the retail store may see more or less traffic than on a sunny day, and extreme weather will inevitably lead to decreased sales and deliveries.  Of equal long-term potential is the use of climate-change science to avoid the “if this goes on” approach to weather forecasting, which becomes increasingly unable to anticipate both secular year-round warming and the increased occurrence of extreme events.  By taking over these functions, IBM allows its customers to drive weather-data acquisition, as well as analytics, in directions that a TWC-owned platform would likely not have taken.
However, according to its presentation of the acquisition, IBM also views the TWC platform as a proven approach for leveraging Big Data in general for new insights, extending the innovative “cognitive” insights of IBM Watson to the sensor-driven Web, aka the Internet of Things (IoT).  I would argue, in fact, that the TWC IT acquisition adds less – and more.  Less, because TWC is not on the cutting edge of the Fast Data “new technology wave” that I have written about previously.  More, because if and when it is combined with the raw “real-time sensor” data that governments collect about the weather, plus related location information (e.g., habitations, flora, fauna, and topography), it provides a simple “location state” of the individual customer or car or delivery truck that can enable as-yet-undefined risk reduction (avoiding the high-water spots in the road), “state of mind” assessment for selling purposes, and services (rerouting based on likely washouts).
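For concreteness, here is a deliberately tiny, made-up sketch of the kind of join-and-model step that “infusing analytics with weather data” implies.  The data, column names, and model choice are all illustrative assumptions of mine and have nothing to do with IBM’s or TWC’s actual platforms or APIs.

import pandas as pd
from sklearn.linear_model import LinearRegression

# Tiny made-up dataset: daily store visits alongside that day's weather.
data = pd.DataFrame({
    "temp_f":    [68, 72, 55, 40, 38, 75, 60, 45],
    "precip_in": [0.0, 0.0, 0.3, 1.1, 1.4, 0.0, 0.1, 0.8],
    "visits":    [520, 560, 430, 310, 290, 585, 480, 340],
})

# Weather-aware baseline: predict visits from temperature and precipitation.
model = LinearRegression().fit(data[["temp_f", "precip_in"]], data["visits"])

# Compare expected traffic on a mild, dry day with a cold, rainy one.
forecast = pd.DataFrame({"temp_f": [70, 38], "precip_in": [0.0, 1.2]})
print(model.predict(forecast))

The point of even so crude a sketch is that the weather columns become ordinary features in whatever behavior-prediction model a retailer or insurer already runs; the value of the TWC data lies in how broadly and reliably those columns can be filled in.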

An Odd Way to Slice a Duck

Certain parts of the TWC acquisition effectively put IBM into areas that are not necessarily a good long-term fit – and goodness of fit will depend on IBM’s execution of the acquisition.
For one thing, acquiring the TWC web sites effectively makes IBM a media company.  Hurricane Sandy, in particular, created large followings for the blogs of certain weather forecasters.  While IBM can pretend that these are merely avocations of TWC personnel focused on The Weather Channel, the fact is that they represent an important news and analysis source for a significant segment of the public.  Whatever IBM does with the link between TWC content suppliers and the web sites will indicate whether IBM is going to run the media part of its slice of TWC into the ground or allow the weather forecasters and analysts who remain with TWC to continue leveraging their clout as experts into eyeballs and advertising revenue.
And the question of who goes where brings up another area where IBM is venturing into new and not necessarily compatible territory:  weather and climate expertise.  As a recent IBM presentation citing weather data’s usefulness to insurance companies shows, applications for mobile car users that can reduce customer risks (say, avoiding or sitting out a storm) certainly have an upside, but what is not clear is whether the insurance company can use such a mobile app frequently enough to reduce overall customer risk substantially (and thereby increase profit margins).
An obvious case where weather warnings may have an impact on customer risk profiles is a “black swan” – a seemingly unlikely event such as Hurricane Sandy.  In that case, however, individual customer advice is likely to have only a minor effect; rather, weather experts must recognize the need for, and drive, proactive, outside-the-company warnings that ensure that lots of customers take care.  TWC has such experts – but where is their place in IBM’s new weather-related organization?  Perhaps Watson will have such “domain knowledge” in the future, but it does not have it today, and so “weather analytics” badly needs that expertise.  And then weather analytics needs climate domain knowledge and climate-change expertise as well, or else predictive analytics applied to raw weather data will be far too frequently wrong.

The User Bottom Line:  Going Into the Data Unknown Is Good

I have dwelt on my concerns about IBM’s ability to get the most out of its new, oddly sliced duck.  The fact remains, however, that, like Watson, a massive chunk of weather data is important not for its immediate applications but for the ways in which users will drive new types of insights with it.  Just as Watson has spawned new “cognitive” ways to analyze data, driven by users’ use cases, so the early adopters of weather data will generate use cases that suggest other applications that we cannot now foresee.  The savvy user, therefore, will pick the right time and the right use case to begin to use weather data – and IBM’s TWC capabilities are an excellent place to start.
And if IBM figures out how to leverage TWC effectively in the areas that I have cited, so much the better for the user.  That the weather data will also lead IBM to get even more serious about climate change, I suppose, is too much to hope.  But hope I will.