Wednesday, September 21, 2016

August 16th, 2070: Rising Waters Flood Harvard Cambridge Campus, Harvard Calls For More Study of Problem


A freak nor’easter hit the Boston area yesterday, causing a 15-foot storm surge that overtopped the already precarious walls near the Larz Anderson Bridge.  Along with minor ancillary effects such as the destruction of much of Back Bay, the Boston Financial District, and Cambridgeport, the rampaging flood destroyed the now mostly-unused eastern Harvard Business School and Medical School, as well as the eastern Eliot and Lowell Houses.  The indigent students now housed in Mather House can reach only the dorms on floors 5 and above, and only by boat, although according to Harvard President Mitt Romney III the weakened structural supports should last at least until the end of the next school year.
A petition was hastily assembled by student protest groups last night, and delivered to President Romney at the western Harvard campus at Mt. Wachusett around midnight, asking for immediate mobilization of school resources to fight climate change, full conversion to solar, and full divestment from fossil-fuel companies. President Romney, whose salary was recently raised to $20 million due to his success in increasing the Harvard endowment by 10% over the last two years, immediately issued a press release stating that he would gather “the best and brightest” among the faculty and administration to do an in-depth study and five-year plan for responding to these developments.  Speaking from the David Koch Memorial Administrative Center, he cautioned that human-caused climate change remained controversial, especially among alumni.  He was seconded by the head of the Medical School, speaking from the David Koch Center for the Study of Migrating Tropical Diseases, as well as the head of the Business School, speaking from the David Koch Free-Market Economics Center. 
President Romney also noted that too controversial a stance might alienate big donors such as the Koch heirs, whose donations in turn allow indigent students to afford the $100,000 yearly tuition and fees.  He pointed out that as a result of recent necessary tuition increases, and the decrease in the number of students from China and India able to afford them due to the global economic downturn, the endowment was likely to be under stress already, and that any further alienation of alumni might mean a further decrease in the number of non-paying students.  Finally, he noted the temporary difficulties caused by payment of a $50 million severance package for departing President Fiorina.
Asked for a comment early this morning, David Koch professor of environmental science Andrew Wanker said, “Reconfiguring the campus to use solar rather than oil shale is likely to be a slow process, and according to figures released by the Winifred Koch Company, which supplies our present heating and cooling systems, we have a minimal impact on overall CO2 emissions and conversion will be extremely expensive.”  David Koch professor of climate science Jennifer Clinton said, “It’s all too controversial to even try to tackle.  It’s such a relief to consider predictions of further sea rise, instead.  There, I am proud to say, we have clearly established that the rest of the eastern Harvard campus will be underwater sometime within the next 100 years.”
In an unrelated story, US President for Life Donald Trump III, asked to comment on the flooding of the Boston area, stated “Who really cares?  I mean, these universities are full of immigrant terrorists anyway, am I right?  We’ll just deport them, as soon as I bother to get out of bed.”

Monday, September 19, 2016

HTAP: An Important And Useful New Acronym


Earlier this year, participants at the second In-Memory Summit frequently referred to a new marketing term for data processing in the new architectures:  HTAP, or Hybrid Transactional-Analytical Processing.  That is, “transactional” (typically update-heavy) and “analytical” (typically read-heavy) handling of user requests is thought of as loosely coupled, with each database engine somewhat optimized for cross-node, networked operations.

Now, in the past I have been extremely skeptical of such marketing-driven “new acronym coinage,” as it has typically had underappreciated negative consequences.  There was, for example, the change from “database management system” to “database”, which has caused unending confusion about when one is referring to the system that manages and gives access to the data, and when one is referring to the store of data being accessed.  Likewise, the PC notion of “desktop” has meant that most end users assume that information stored on a PC is just a bunch of files scattered across the top of a desk – even “file cabinet” would be better at getting end users to organize their personal data.  So what do I think about this latest distortion of the previous meaning of “transactional” and “analytical”?

Actually, I’m for it.

Using an Acronym to Drive Database Technology


I like the term for two reasons:

1. It frees us from confusing and outdated terminology, and

2. It points us in the direction that database technology should be heading in the near future.

Let’s take the term “transactional”.  Originally, most database operations were heavy on the updates and corresponded to a business transaction that changed the “state” of the business:  a product sale, for example, reflected in the general ledger of business accounting. However, in the early 1990s, pioneers such as Red Brick Warehouse realized that there was a place for databases that specialized in “read” operations, and that functional area corresponded to “rolling up” and publishing financials, or “reporting”.  In the late 1990s, analyzing that reporting data and detecting problems were added to the functions of this separate “read-only” area, resulting in Business Intelligence, or BI (similar to military intelligence) suites with a read-only database at the bottom.  Finally, in the early 2000s, the whole function of digging into the data for insights – “analytics” – expanded in importance to form a separate area that soon came to dominate the “reporting” side of BI. 

So now let’s review the terminology before HTAP.  “Transaction” still meant “an operation on a database,” whether its aim was to record a business transaction, report on business financials, or dig into the data for insights – even though the latter two had little to do with business transactions.  “Analytical”, likewise, referred not to monthly reports but to data-architect data mining – even though those who read quarterly reports were effectively doing an analytical process.  In other words, the old words had pretty much ceased to describe what data processing is really doing these days.

But where the old terminology really falls down is in talking about sensor-driven data processing, such as in the Internet of Things.  There, large quantities of data must be ingested via updates in “almost real time”, and this is a very separate function from the “quick analytics” that must then be performed to figure out what to do about the car in the next lane that is veering toward one, as well as the deeper, less hurried analytics that allows the IoT to do better next time or adapt to changes in traffic patterns.

In HTAP, transactional means “update-heavy”, in the sense of both a business transaction and a sensor feed.  Analytical means not only “read-heavy” but also gaining insight into the data quickly as well as over the long term.  Analytical and transactional, in their new meanings, correspond to both the way data processing is operating right now and the way it will need to operate as Fast Data continues to gain tasks in connection to the IoT.

But there is also the word “hybrid” – and here is a valuable way of thinking about moving IT data processing forward to meet the needs of Fast Data and the IoT.  Present transactional systems operating as a “periodic dump” to a conceptually very separate data warehouse are simply too disconnected from analytical ones.  To deliver rapid analytics for rapid response, users also need “edge analytics” done by a database engine that coordinates with the “edge” transactional system.  Nor can transactional and analytical systems operate in lockstep as part of one engine, because we cannot afford to have each technological advance on the transactional side wait for a new revision of the analytical side, or vice versa.  HTAP tells us that we are aiming for a hybrid system, because only that has the flexibility and functionality to handle both Big Data and Fast Data.
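To make the loose coupling concrete, here is a minimal Python sketch of the idea rather than any vendor’s actual HTAP engine: an update-heavy ingest path and a read-heavy “edge analytics” path share an in-memory store without blocking one another.  All class and method names here are hypothetical.

```python
from collections import deque
from statistics import mean

# Hypothetical sketch of loosely coupled HTAP: a transactional (update-heavy)
# ingest path feeds an in-memory window that an "edge analytics" (read-heavy)
# path queries quickly, while a deeper archive serves unhurried analytics.

class HybridStore:
    def __init__(self, window=1000):
        # Recent readings kept in memory for fast "edge" analytics.
        self.window = deque(maxlen=window)
        self.archive = []  # stand-in for the deeper, less hurried analytics store

    def ingest(self, reading):
        """Transactional side: a cheap, update-heavy append."""
        self.window.append(reading)
        self.archive.append(reading)

    def edge_alert(self, threshold):
        """Analytical side: a quick check over recent data only."""
        recent = list(self.window)[-10:]
        return bool(recent) and mean(recent) > threshold

store = HybridStore()
for r in [0.1, 0.2, 0.1, 5.0, 5.2, 5.1]:   # e.g., a sensor feed
    store.ingest(r)
print(store.edge_alert(threshold=1.0))  # recent readings run hot -> True
```

The point of the sketch is only the separation of concerns: either side could be swapped for a better engine without a new revision of the other.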

The Bottom Line


I would suggest that IT shops looking to take next steps in IoT or Fast Data try adopting the HTAP mindset.  This would involve asking oneself:

· To what degree does my IT support both transactional and analytical processing by the new definition, and how clearly separable are they?

· Does my system for IoT involve separate analytics and operational functions, or loosely-coupled ones (rarely today does it involve “one database fits all”)?

· How well does my IT presently support “rapid analytics” to complement my sensor-driven analytical system?

If your answer to all three questions puts you in sync with HTAP, congratulations:  you are ahead of the curve.  If, as I expect, in most cases the answers reveal areas for improvement, those improvements should be a part of IoT efforts, rather than attempts to patch the old system a little to meet today’s IoT needs.  Think HTAP, and recognize the road ahead.

Saturday, September 17, 2016

The Climate-Change Dog That Did Bark In The Night: CO2 Continues Its Unprecedented 6-Month Streak


In one of Conan Doyle’s Sherlock Holmes mysteries, an apparent theft of a racehorse is solved when Holmes notes “the curious incident of the dog in the night” – the point being that the guard dog for the stable did not bark, showing that the only visitor was known and trusted.  In some sense, CO2 is a “guard dog” for oncoming climate change, signaling future global warming when its increases overwhelm the natural Milankovitch and other climate cycles.  It is therefore distressing to note that in the last 6 months, the dog has barked very loudly indeed:  CO2 in the atmosphere has increased at an unprecedented rate. 

And this is occurring in the “nighttime”, i.e., at a time when, by all our measures, CO2 emissions growth should be flat or slowing down.  As noted in previous posts, efforts to cut emissions, notably in the EU and China, plus the surge of the solar industry, have seemed to lend credibility to metrics of carbon emissions from various sources that suggest more or less flat global emissions in 2014 and 2015 despite significant global economic and population growth.

What is going on?  I have already noted the possibility that a major el Nino event, such as occurred in 1998, can cause a temporary surge in CO2 growth.  In 1998, indeed, CO2 growth set a record that was not beaten until last year, but in the two years after 1998, CO2 atmospheric ppm growth fell back sharply to nearly the previous level.  By our measures, the el Nino occurring in the first 5 months or so of 2016 was about equal in magnitude to the one in 1998, so one would expect to see a similar short surge.  However, we are almost 4 months past the end of this el Nino, and there is very little sign of any major decrease in growth rate.  It already appears certain that we cannot dismiss the CO2 surge as a short-term blip.

Recent Developments in CO2 Mauna Loa


In the last few days, I was privileged to watch the video of Prof. John Sterman of MIT, talking about the “what-if” tool he had developed and made available, in which national emissions-reduction targets of varying aggressiveness drive CO2 emissions growth and hence modeled climate outcomes.  He was blunt in saying that even the commitments coming out of the Paris meeting are grossly inadequate, but he did show how much more aggressive targets could indeed keep total warming to 2 degrees C.  In fact, he was so forthright and well-informed that I could finally hope that MIT’s climate-change legacy would not be the government-crippling misinformation of that narcissistic hack Prof. Lindzen.

However, two of his statements – somewhat true in 2015 but clearly not true at this point in 2016 (the lecture, I believe, was given in the spring of 2016) – stick in my head.  First, he said that we are beginning to approach 1.5 degrees C growth in global land temperature.  According to the latest figures cited by Joe Romm, the most likely global land temperature rise for 2016 will be approximately 1.5 degrees C.  Second, he said that CO2 (average per year) had reached the 400 ppm level – a statement true at this time last year.  As of April-July 2016, however, the average per year has reached between 404 and 405 ppm.

CO2 as measured at the Mauna Loa observatory tends to follow a seasonal cycle, with the peak occurring in April and May, and the trough in September.  In the last few years, at all times of the year, growth year-to-year (measured monthly) averaged slightly more than 2 ppm.  Note that this was true for both 2014 and most of 2015.  Then, around the time that the el Nino arrived, it rose to 3 ppm.  But it didn’t stop there:  In April, a breathtakingly sharp rise of 4.1 ppm took CO2 up to 408 ppm.  And it didn’t stop there:  May and June were likewise near 4 ppm, and the resulting total average rise through August has been almost 3.6 ppm.  September so far continues to be in the 3.3-3.5 range. 
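To make “growth year-to-year (measured monthly)” concrete, the calculation can be sketched as below: subtracting the same calendar month a year earlier cancels most of the seasonal cycle.  The ppm values used are illustrative round numbers, not actual Mauna Loa readings.

```python
# Year-over-year CO2 growth "measured monthly": compare each month with the
# same month a year earlier, which removes most of the seasonal up-and-down.
# These ppm values are illustrative, not actual Mauna Loa observatory data.

monthly_2015 = {"Apr": 403.5, "May": 403.9, "Jun": 402.8}
monthly_2016 = {"Apr": 407.6, "May": 407.7, "Jun": 406.8}

growth = {m: round(monthly_2016[m] - monthly_2015[m], 1) for m in monthly_2015}
print(growth)  # {'Apr': 4.1, 'May': 3.8, 'Jun': 4.0}
```

A run of such differences near 4 ppm, against a recent baseline of roughly 2 ppm, is what makes this year’s figures so striking.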

CO2, el Nino, Global Land Temperature:  What Causes What?


Let’s add another factoid.  Over the last 16 or so months, each month’s global land temperature has set a new record for that month.  In fact, July and August (July is typically the hottest month) tied for the absolute heat record ever recorded, well ahead of the records set last year.

So here we have three factors:  CO2, el Nino, and variations in global land temperature.  Clearly, in this heat wave “surge”, the land temperature started spiking first, the full force of el Nino arrived second, and the full surge in CO2 arrived third.  On the other hand, we know that atmospheric CO2 is not only a “guard dog”, but also, in James Hansen’s phrase, a “control knob”:  in the long term, for large enough variations in CO2 (which can be 10 ppm in some cases), the global land temperature will eventually follow CO2 by rising or falling in proportion.  Moreover, it seems likely that el Nino’s short-term effect on CO2 must be primarily by raising the land temperature, which does things like expose black carbon on melting ice for release to the atmosphere, or increase the imbalance between carbon-absorbing forest growth and carbon-emitting forest fires by increasing the incidence of forest fires.

But I think we also have to ask whether the effect of increasing CO2 on global temperatures (land plus sea, this time) begins over a shorter time frame than we thought.  The shortest time frame for a CO2 effect suggested by conservative science is perhaps 2,000 years, based on a spike in CO2 less than a million years ago that caused Arctic melting indicative of global warming.  Hansen and others, as well, have identified 360 ppm of atmospheric CO2 as the level at which Arctic sea ice melts out, and we only passed that level about 20 years ago – a paper by a Harvard professor projects that Arctic sea ice will melt out at minimum somewhere between 2032 and 2053.  In other words, we at least have some indication that CO2 can affect global temperature in 50-100 years or so.

And finally, we have scientific work showing that global land temperature increases that melt Arctic sea and land ice affect albedo (e.g., turn white ice into blue water), which in turn increases sea and land heat absorption and hence temperatures, and these changes “ripple down” to the average temperatures of temperate and subtropical zones.  So part of global land temperature increase is caused by – global land temperature increase.  It is for these reasons that many scientists feel that climate models underestimate the net effect of a doubling of CO2, and these scientists estimate that rather than 500 ppm leading to a 2 degree C temperature increase, it will lead to a 4 degree C increase.
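The amplification argument above can be put in the standard feedback-gain form: if feedbacks (such as ice-albedo) add back a fraction f of any warming as further warming, a no-feedback response t0 becomes t0 / (1 - f).  A minimal sketch, schematic arithmetic rather than a climate model:

```python
# Schematic feedback amplification, not a climate model: if a CO2 increase
# alone would warm the planet by t0 degrees C, and feedbacks (such as the
# ice-albedo effect) add a fraction f of any warming as further warming,
# the total response converges to t0 / (1 - f).

def amplified_warming(t0, f):
    assert 0 <= f < 1, "f >= 1 would mean runaway feedback"
    return t0 / (1 - f)

# A feedback fraction of 0.5 doubles a 2 degree C no-feedback response:
print(amplified_warming(2.0, 0.5))  # 4.0
```

This is one simple way to see how the same 500 ppm could plausibly yield 2 degrees C under weak assumed feedbacks but 4 degrees C under stronger ones.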

I would summarize by saying that while it seems we don’t know enough about the relationship between CO2, el Nino, and global land temperature, it does seem likely that today’s CO2 increase is much more than can be explained by el Nino plus land temperature rise, and that the effects of this CO2 spike will be felt sooner than we think.

Implications


If the “guard dog” of CO2 in the atmosphere is now barking so loudly, why did we not anticipate this?  I still cannot see an adequate explanation that does not include the likelihood that our carbon-emissions metrics are failing to capture an increasing proportion of what we actually put into the air.  That certainly needs looking into.

At the same time, I believe that we need to recognize the possibility that this is not a “developing nations get hit hard, developed ones get by” or “the rich escape the worst effects” story.  If things are happening faster than we expect and may result in temperature changes higher than we expect, then it is reasonable to assume that the “trickle-up” effects of climate change may become a flood in the next few decades, as rapid ecosystem, food, and water degradation starts affecting the livability of developed nations and their ability to feed their own reasonably-well-off citizens. 

The guard dog barks in the night-time, while we sleep.  We have a visitor that is not usual, and should not be trusted or ignored.  I urge that we start paying attention. 

Wednesday, September 14, 2016

In Which We Return to Arctic Sea Ice Decline and Find It Was As Bad As We Thought Seven Years Ago

For the last 3 years or so, I have rarely blogged about Arctic sea ice, because my model of how it worked seemed flawed and yet replacement models did not satisfy. 

Until 2013, measures of Arctic sea ice volume trended downward at minimum (usually in early September), exponentially rather than linearly.  Then, in 2013, 2014, and 2015, volume, area, and extent measures seemed to rebound above 2012, almost to the levels of 2007, the first “alarm” year.  My model, which was of a cube of ice floating in water and slowly moving from one side of a glass to the other, with seasonal heat increasing yearly applied at the top, bottom, and sides, simply did not seem to reflect what was going on.

And now comes 2016 (the year’s melting is effectively over), and it seems clear that the underlying trends remain, and even that a modified version of my model loosely fits what’s happening.  This has been an unprecedented Arctic sea ice melting year in many ways – and the strangest thing of all may be this glimpse of the familiar.

Nothing of It But Doth Change, Into Something Strange

The line adapts Ariel’s song from Shakespeare’s The Tempest, sung to Ferdinand of his father’s supposed drowning:  “Full fathom five thy father lies/Of his bones are coral made/ … /Nothing of him that doth fade/But doth suffer a sea-change/Into something rich and strange.”  And, indeed, the changes in this year’s Arctic sea ice saga deserve the title “sea-change.”  Here’s a list:
  1. Lowest sea-ice maximum, by a significant amount, back in late March/April.
  2. Lowest 12-month average.
  3. Unprecedented amount of major storm activity in August.
  4. Possibly greatest amount of melt during July for years when generally cloudy conditions hinder melting.
  5. First sighting of a large “lake” of open water at the North Pole on Aug. 28, when two icebreakers parked next to an ice floe and one took a picture.  Santa wept.
  6. First time since I have been monitoring Arctic sea ice melt that the Beaufort Sea (north of Canada and Alaska) was almost completely devoid of ice to within 5 degrees of the pole.


These events actually describe a coherent story, as I understand it.  It begins in early winter, when unusual ocean heat shows up at the edges of the ice pack, especially around Norway.  The ocean temperature (especially in the North Atlantic) has been slowly rising over time, but this time it seemed to cross a threshold:  parts of the Atlantic near Norway stayed ice-free all the way through the year, leading to the lowest-ever Arctic sea ice maximum.

In May and early June, the sun was above the horizon but temperatures were still significantly below freezing in the Arctic, so relatively little melting got done. Then, when cloudy conditions descended and stayed late in June or thereabouts, the relative ocean heat counteracted some of the loss of melting energy from the sun.  But another factor emerged:  the thinness of the ice.  Over 2013, 2014, and 2015, despite the lack of summer melt, multi-year ice remained a small fraction of the whole.  So when a certain amount of melt occurred in July and August, water “punched through” in many places, so that melting was occurring not just on the top (air temperature) and bottom (ocean heat) but also the sides (ocean heat) of ice floes.  And this Swiss cheese effect was happening not just at the periphery, but all over the central Arctic – hence the open water at the Pole.

Then came the storms of August.  In previous years, at any time of the year, the cloudiness caused by storms counteracted their heat energy.  This year, the thin, broken ice floes were driven by waves that carried much of the storms’ energy, further melting them.  And, of course, one of the side effects was to drive sea ice completely out of the Beaufort Sea.

Implications For The Future

It is really hard to find good news in this year’s Arctic sea ice melting season.  Years 2013-2015 were years of false hope, in which it seemed that, although eventually Arctic sea ice must reach zero at minimum (defined as less than 1% of the Arctic Ocean covered with ice), we had reached a period of flat sea ice volume, which only a major disturbance such as abundant sunshine in July and August could tip into a new period of decline.

However, the fact of 2016 volume decrease in such unpromising weather conditions has pretty much put paid to those hopes.  It is hard to see what can stop continued volume decreases, since neither clouds nor storms will apparently do so any longer.  One can argue that the recent el Nino artificially boosted ocean temperatures, although it is not clear how it could have such a strong relative effect; but there is no sign that ocean heat will return to 2013-2015 levels now that the el Nino is over.  

Instead, the best we can apparently hope for is a year or two of flatness at the present volume levels if such an el Nino effect exists, not a return to 2013-2015 levels.  My original off-the-cuff projection “if this [volume decreases 2000-2012] goes on” was for zero Arctic sea ice around 2018, and while I agree that the 2013-2015 “pause” makes 2018 very unlikely, 2016 also seems to make any “zero date” after 2030 less likely than zero Arctic sea ice at some point in the 2020s. 

A second conclusion I draw is that my old model, while far overestimating the effects of “bottom heating” pre-2016, now works much better in the “fragmented ice” state of today’s Arctic sea ice in July and August.  In this model, as ice volume approaches zero at minimum, volume flattens out, while extent decreases rapidly and area more rapidly still (unless the ice is compacted by storms, as occurred this year).  This effect will be obscured to some extent, as present measurement instruments can’t distinguish between “melt ponds” caused by the sun’s heat and actual open water.

Finally, the Arctic sea “ice plug” that slowed Greenland glacier melt by pushing back against glacier sea outlets continues to appear less and less powerful.  This year, almost the entire west coast of Greenland was clear of ice by early to mid June – a situation I cannot recall ever happening while I’ve been watching.  Since this speeds glacier flow and therefore melting at its terminus entering the sea, it appears that this decade, like the 1990s and 2000s, will show a doubling of Greenland snow/ice melt.  James Hansen’s model, which assumes this will continue for at least another couple of decades, projects 6-10 feet of sea rise by 2100.  And even this may be optimistic – as I hope to discuss in a follow-on post on 2016’s sudden CO2 rise.

Most sobering of all, in one sense we are already as near to zero Arctic sea ice as makes no difference.  Think of my model of an ice cube floating in water in a glass for a minute, and imagine that instead of a cube you see lots of thin splinters of ice.  You know that it will take very little for that ice to vanish, whereas if the same volume of ice were still concentrated in one cube it will take much more.  By what I hear of on-the-ground reports, much of the remaining ice in the Arctic right now is those thin chunks of ice floating in water. 

“Because I do not hope to turn again/Because I do not hope/ … May the judgment not be too heavy on us/ … Teach us to care and not to care.”  T.S. Eliot, Ash Wednesday