Saturday, December 15, 2018

Climate Change Fall 2018: Postscript to Addendum


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.

Two new factoids:

1.       The CO2 data from Mauna Loa are now showing that CO2 levels (averaged over the last ½ year plus a projection of the next six months) reached 410 ppm in Nov.  This comes a little more than three years after that measure reached 400 ppm.

2.       The estimate of carbon emissions – flat during 2014-2016 – rose by 1.6% in 2017 and is projected to rise by 2.7% in 2018.  The primary increases were from China and India, but the US also rose; only Europe, among the major contributors, decreased.  Although, as I have noted, this measure may well be flawed as an indicator of the underlying rise in carbon emissions, the very fact that it can now be monitored on a monthly basis suggests that some of the flaws have been worked out.  It is therefore less likely to be an underestimate of carbon emissions, and hence the reported rate of rise is more likely to be accurate, or at worst a slight overestimate.

Let me reiterate the conclusion in my Oct. addendum more forcefully:  I am told that I have, on average, 8 ½ years more to live.  By the time I am dead, CO2 seems all but certain to reach 430 ppm, and may well be approaching 440 ppm.  By 2050, if things simply continue linearly instead of accelerating the way they have done for the past 60 years, we will be at roughly 500 ppm, approaching double the roughly 280 ppm level at the start of the Industrial Revolution.  According to James Hansen and others, this bakes in a long-run global temperature rise since then of 4 degrees Centigrade (7 degrees Fahrenheit), with at least 2 degrees C since the Industrial Revolution in the short run – another 2 degrees F or so from the way things are right now.  
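(For the arithmetic-minded, here is a minimal sketch of that linear projection; the assumed growth rate of roughly 2.5 ppm per year is my own reading of recent Mauna Loa data, not an official figure.)

```python
# Back-of-the-envelope linear projection of atmospheric CO2 (Mauna Loa).
# Assumption: roughly 2.5 ppm/year of continued growth, with no further acceleration.

START_YEAR = 2018
START_PPM = 410.0          # level reached in Nov. 2018 (see above)
GROWTH_PPM_PER_YEAR = 2.5  # assumed recent average annual rise

def projected_ppm(year):
    """Linear extrapolation of the CO2 concentration to a future year."""
    return START_PPM + GROWTH_PPM_PER_YEAR * (year - START_YEAR)

for year in (2027, 2050):
    print(year, round(projected_ppm(year)), "ppm")
# 2027 (roughly 8.5 years out) -> ~432 ppm; 2050 -> ~490 ppm,
# approaching the ~500 ppm figure cited above.
```
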
Another point:  There is a clear line between recent increases in carbon emissions and the administration of President Donald Trump.  That administration's lack of support is clearly linked not only to the US increases (via a strong rise in US oil, shale, and natural gas production) but also to decreased pressure on India and China, both in bilateral relations and in the meetings regarding implementation of the Paris Agreement.


Wednesday, October 31, 2018

Climate Change and Economics: The Invisible Hand Never Picks Up the Check


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.

Over the past few days, I have been reading Kim Stanley Robinson’s “Green Earth” (his “Science in the Capital” trilogy revised into a single volume), an examination of possible futures and strategies for dealing with climate change thinly disguised as science fiction.  One phrase in it struck me with especial force:  “the blind hand of the market never picks up the check.”  To put it in more economic terms:

·         Firms, and therefore market economies as a whole, typically seek to maximize profit and – because the path to profit from new investment is always uncertain – to focus particularly on cost minimization within a chosen, relatively conservative profit-maximization strategy.

·         To minimize costs, they may not only use new technologies (productivity enhancement), but also offload costs as far as possible onto other firms, consumers, workers, societies, and governments.  Of these, the most difficult is offloading costs onto other firms (e.g., via supply-chain management), since those firms are also competing to minimize their costs and therefore to offload right back.  Therefore, especially for the large, global firms that dominate today’s markets, the name of the game is not only to minimize the costs coming from workers, consumers (consider help desks as an example), and societies/governments, but also to extract “subsidies” from them:  time flexibility or overtime from workers; consumers performing more of the work of (and bearing more of the risk of) the sales transaction; governments not only providing subsidies but also things such as infrastructure support, education and training of the work force, and dealing with natural disasters – now including climate change. 

Often, especially in regard to climate change, economists refer to the process of the invisible hand never picking up the check as the “tragedy of the commons.”  The flaw of this analysis is that it implicitly limits one’s gaze to tangible property.  If one instead uses money equivalents as a broader metric, then it is clear that it is not just “common goods” that are being raided, but personal non-goods such as the time of workers, consumers, and those who are neither – time whose loss translates into poorer health and less ability to cope with life’s demands, sapping productivity directly as well as via its effects on the worker’s or consumer’s support system, not to mention the government’s ability to compensate as it is starved of money.  And all of this still does not capture the market’s ability to “game the system” by monopolizing government and the law.
Another point also struck me when I read this phrase:  macroeconomics does not even begin to measure the amount of that “cost raiding”, instead referring to it as “externalities”.  And therefore:
Economics cannot say whether market capitalism is better than other approaches, or worse, or the same.   It cannot say anything at all on the subject.

Further Thoughts About Economics and Alternatives to Market Capitalism


A further major flaw, imho, in economics’ approach to the whole subject is the idea that cost minimization should be not only a desired end but also the major goal of an enterprise.  I am specifically thinking of the case of the agile company.  As I have mentioned before, agile software development deemphasizes cost, quality, revenue, time to market, and profit in favor of constantly building in flexibility to adjust to and anticipate the changing needs of the consumer.  And yet, agile development outperforms approaches that do concentrate on these metrics by at least 25%, and sometimes by 100%.  
If the entire economy were based on real agile firms, I would suggest that we would see a comparable improvement in the economy – permanently.  Moreover, the focus on the consumer should lead to a diminution in “cost raiding”.  The focus on being truly in tune with the consumer’s needs, for example, should diminish raiding the consumer’s time in the sales transaction and forcing them to use the help-desk bottleneck.  And I still live in hope that agile development with fewer time constraints will empower the developer with the ability to seek out and implement his or her own tools to improve processes, thereby allowing better retraining.   

Implications of Climate Change for Economics and Market Capitalism


Robinson includes a critique of market capitalism in his work, and concludes that it has to change fundamentally.  I find the critique itself problematic; but that doesn’t mean he isn’t right in his conclusion.
The fundamental question to me is, what happens when externalities go in reverse, and suddenly the things that have led to ongoing profits lead to ongoing losses?  Robinson paints a frightening picture of a world in which brownouts, blackouts, killing cold, and killing heat are common, and insurance, whether private or governmental, cannot adequately compensate, leading additional costs to settle, inexorably, on their last resort, business.  Then, implicitly, firms must cannibalize each other, with the largest being best equipped to do so.
I tend to place things in less apocalyptic terms.  According to Prof. DeLong, GDP growth can be thought of as part improvement in productivity and part expansion of the workforce.  The climate-change scenario necessarily implies a shrinkage of that workforce (in labor-hours) faster than productivity can climb, and therefore a constantly shrinking market.  In that case, the market’s rising need for “cost raiding” as the market shrinks simply speeds up the shrinkage of the market – not to mention of the underlying societies.  And that, to me, is the fundamental flaw that needs correcting. 
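A minimal sketch of that decomposition, using purely hypothetical growth rates chosen only to illustrate the shrinking-market point (the approximation itself is the standard growth-accounting identity, not anything specific to DeLong):

```python
# Growth-accounting approximation: GDP growth ~= productivity growth + labor-hours growth.
# The rates below are hypothetical, chosen only to illustrate the argument.

productivity_growth = 0.010   # suppose productivity rises 1.0% per year
labor_hours_growth = -0.015   # suppose climate impacts shrink labor-hours 1.5% per year

gdp_growth = productivity_growth + labor_hours_growth
print(f"approximate GDP growth: {gdp_growth:+.1%} per year")  # -> -0.5% per year
# If labor-hours shrink faster than productivity rises, the market as a whole shrinks.
```
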
Theoretically, one option is to capture things like “the social cost of carbon” in company accounting – an idea I wrote about five years ago.  Practically speaking, the uneven effects of that across companies mean real impacts on the employees of coal and oil companies, something we have already seen a small foretaste of.  That foretaste has also revealed the ability of oil and coal companies to entirely snarl the political process in order to prevent adequate steps at limiting “cost raiding” – which makes our carbon-pricing efforts, in real-world terms, more likely than not to be inadequate to reverse the “cost raiding” trend.  
The obvious alternative, which I and others have argued for and I in fact picked up on eight years ago when I first understood the dire implications of climate change, is “World War II in America”, governmental interference in the economy comparable to that of WWII in order to “win the war on climate change”.  Only, of course, the aim is to lose the war with as little damage as possible.  So suppose we do that; what then?
The obvious answer is, “sustainability” – meaning practices that will ensure that having “won the war”, we don’t lose it again in the future by slipping back into the old carbon-guzzling, ecology-devastating, arable-land-destroying habits.  Is that enough?  Robinson says no, that despite sustainability, cost raiding will continue to increase in other areas.  And here I tend to agree with him, although I am not sure.
It appears, reverting to Prof. DeLong’s point above, that it is possible with sustainability to continue to improve both human welfare and corporate profitability, by improving productivity with a more or less stable (almost certainly shrunken) population and workforce.  However, productivity improvement may well be smaller than in the Industrial Revolution – it has already slowed for an unduly long time.  And if that is the case, then there is no market-capitalism path forward that delivers today’s increases in corporate profitability while avoiding increases in cost raiding.
I don’t know the answer to this.  I feel, however, that the beginnings of an answer lie not in perpetually increasing the size of the workforce by improving human welfare, while somehow not increasing population, but rather in perpetually increasing “consumer productivity”:  the value that people get out of their lives, that they can then invest in others.  More specifically, I think markets can be divided into those for carrying out daily tasks (“Do”), those for socializing and participating in society (“Socialize”) and those for learning and creating (“Learn”).  A balance must be kept between these efforts in any individual’s life, so the perpetual increases must be achieved inside each of these three sets of markets. 
I would argue that today’s market economies use “Do” to crowd out much of the other two sets of markets, and are less good at perpetually increasing the value of “Socialize” and “Learn”, although the crowding-out may mean that “Do”’s superiority is illusory.  I have no clear idea as to what to do about my conclusions, except to examine each set of markets more closely to gain clues as to how to achieve this perpetual value increase.
Just some thoughts.  And oh, by the way, Robinson is indeed worth reading.

Tuesday, October 30, 2018

Climate Change Fall 2018: A Personal Addendum


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.
One thing I did not note in my recent climate-change update:  The CO2 data from Mauna Loa are now showing that CO2 levels (averaged over the last ½ year plus a projection of the next six months) reached 409 ppm in Sept.  This is about three years after that measure reached 400 ppm, and less than six months before it will reach 410 ppm.
I am told that I have, on average, 8 ½ years more to live.  By the time I am dead, CO2 will in all likelihood have reached 430 ppm, and may well be approaching 440 ppm.  By 2050, if things simply continue linearly instead of accelerating the way they have done for the past 60 years, we will be at roughly 500 ppm, approaching double the roughly 280 ppm level at the start of the Industrial Revolution.  According to James Hansen and others, this bakes in a long-run global temperature rise since then of 4 degrees Centigrade (7 degrees Fahrenheit), with at least 2 degrees C in the short run – another 2 degrees F or so from the way things are right now.  
Heckuva job, humanity.

Local Markings of Climate Change These Days


I have lived in the Northeast US for all of my 68-year life, the last 40 years of it near Boston.  This year, there are so many weather changes I cannot remember ever seeing before.
It is now a day before Halloween.  For the first time ever, most of the leaves are still on the trees.  Leaf coloration only began happening in early October, the latest ever; it used to happen in mid-September. 
In late October, shortly before a playoff game was to be played in Fenway Park, there was a thunderstorm.  That has never happened in late October.  As a matter of fact, thunderstorms only used to happen around here once or twice in mid-summer – if that.  
This last summer was hot (as usual) and humid (something that has only been happening in the last 10 years).  It started in late June and went full tilt until mid-September, which it has also never done before, at a typical “feels like” temperature of the upper 80s to the low 90s F.  Many days, I stayed indoors all day and night.
All year, the wind has been strong – typically 10 mph faster than even 15 years ago.  My backyard is well shielded by trees from the wind, and until the last couple of years I could look out and not see the leaves and branches moving.  This year, I typically see them moving even close to the house.
There has been a lot of rain this year.  What’s unprecedented is that most rains are hard rains, with big raindrops hammering on the roof.  Going out for a walk during a rainstorm, with wind blowing your umbrella wildly, the streets flooded an inch or three, and the wind driving the large raindrops horizontally onto your clothing, is contraindicated in most cases.  So even in the spring and fall of this year, some days I spend indoors all day and night.
And I know that from here on, on average, it all only gets worse.  100-mph nor’easter, anyone?

Thursday, October 18, 2018

Reading New Thoughts: O’Reilly’s What’s The Future, the Agile Entity, and Prediction


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.

Tim O’Reilly’s “WTF:  What’s the Future, and Why It’s Up to Us” is, by turns, an insightful memoir of many of the computer-industry events that I lived with and to some extent participated in, a look at the oncoming technologies coming from the computer industry and its spinoffs (with particular over-emphasis on Uber and Lyft), and an attempt to draw general conclusions about how we all should anticipate what is going to impact us in the future and how we should “ride the wave.”
First, I want to add a caveat that I think should become a Law:

The future happens faster than we think – and then it happens slower than we think.

By this I mean:  when new technological breakthroughs arrive, not all of them are obvious to the particular part of the economy that we attend to, even if (today) they are linked to it by software technology.  Then, even when they seem like the “new new thing” everywhere in our particular area, they typically take 10-30 years to spread to the world at large.  For example, smartphones and their apps (themselves over 10 years old) are by no means ubiquitous in the Third World, despite the hype.
I’d like to note here several instances of the future arriving “faster than we think”, some profiled in WTF.  Among the ones that I find amazing (and sometimes frightening):
·         We can now alter and replace 20-gene DNA and RNA segments, and hence genes in general, not only for the next generation but also, in many cases, over the course of a few months for our own.  The work to achieve that happened less than 10 years ago, practical implementation was achieved less than 5 years ago, and a major international prize for that work (led by Jennifer Doudna and her team, described in her book) – the Kavli Prize – was awarded this year.

·         Pictures of anyone can be inserted seamlessly into a different scene, making it very hard to judge the truth of the news pictures that we see every day.

·         Understandable automated language translation (e.g., Google), automated voice recognition, and automated picture recognition have been achieved (although “good” speech recognition has still not been reached).

·         Semi-automated bots generate comments on articles and in blogs that are often indistinguishable from the ungrammatical and rambling comments of many humans.  Hence, the attempts at hacking political elections and the increasing difficulty of figuring out the truth of events in the public arena, “crowded out” as they now sometimes are by false “rumors.”

More subtly:
·         Uber and Lyft create “instant marketplaces” matching buyers and sellers of taxi services.

·         Supermarkets now dictate to growers production of specific items with detailed specification of quality and characteristics, based on narrow segments of the consumer market.

Now, let’s talk about what I think are the new thoughts generated by WTF.  In particular, I want to suggest that O’Reilly’s view of how to “predict” which oncoming technologies should be factored into one’s business, government, or personal strategy going forward, how to fit these into an overall picture, and how to use that picture to develop strategy, is really of most use in developing an “agile strategy” for an agile entity.

WTF Key Technologies and Strategies


Perhaps the best place to start is with WTF’s “Business Model of the Next Economy,” i.e. a model of the typical firm/organization in the future.  There are many subtleties in it, but it appears to break down into:
·         Central “Networked Marketplace Platforms”, i.e., distributed infrastructure software that provides the basis for one or many automated “marketplaces” in which buyers and sellers can interact (a toy sketch of the core matching step appears after this list).  In the supply chain, the firm would be the primary buyer; at the retail level, it would be the primary seller.

·         Feeding into these platforms, an approach that “replaces markets with information” – instead of hoarding information and using that hoarding to drive monopoly sales, the firm releases information openly and uses the commoditization of the product to drive dominance of new products.

·         Also feeding into the platforms, a new approach to user interfacing that seeks to create “magical experiences.”  This particularly enhances the firm’s and platform’s “reputation.”

·          Another “feeder” is “augmented” workers – workers enabled by rather than replaced by AI-type software.

·         A fourth “feeder” is “On-Demand” (applied flexibly, as needed) talent (workers given added value by their talents) and resources.  This includes an emphasis on actively helping workers to succeed, including over the long run.

·         A fifth feeder – somewhat complementary to the fourth – is “Alternatives to Full-Time Employment”, where the emphasis is on being flexible for the benefit of the worker, not the employer – the takeaway being that this actually benefits the employer more than WalMart-style “show up when we need you and never mind your personal life” approaches.  The key newness of this approach is that it is “managed by algorithm” – the algorithm allows both the employer and the employee to manage their needs in a semi-automated fashion.

·         Returning to the business itself, the final feeder to the marketplace platform is “Services on Demand”, which offers the consumer an interface that provides an ongoing service rather than simply selling a product.  This is enhanced by “marketplace liquidity,” ways to make it easier for the consumer to buy the service.
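To make the “networked marketplace platform” idea a bit more concrete, here is the toy matching sketch promised above.  It is purely illustrative – my own invented names and data, not anything from WTF – and a real platform would of course layer pricing, reputation, and the other “feeders” on top of it:

```python
# Toy "networked marketplace": match each buyer's request to the cheapest
# compatible seller offer. Purely illustrative; names and data are made up.

from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    category: str
    price: float

@dataclass
class Request:
    buyer: str
    category: str
    max_price: float

def match(requests, offers):
    """Greedily pair each request with the cheapest unmatched, compatible offer."""
    available = sorted(offers, key=lambda o: o.price)
    matches = []
    for req in requests:
        for offer in available:
            if offer.category == req.category and offer.price <= req.max_price:
                matches.append((req.buyer, offer.seller, offer.price))
                available.remove(offer)
                break
    return matches

offers = [Offer("driver_a", "ride", 12.0), Offer("driver_b", "ride", 9.5)]
requests = [Request("rider_1", "ride", 15.0)]
print(match(requests, offers))  # [('rider_1', 'driver_b', 9.5)]
```
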

At this point I revert to my caveat/Law in the beginning.  This “next economy” model is already operating in parts of the computer industry and related fields, e.g., Amazon, Google, Lyft – the future has already happened faster than we think.  At the same time, there will be a longer time than we think before it diffuses across the majority of organizations, if it does so at all.  Government and law are two obvious places considered in WTF where this model holds great potential, but will take a long, long time to effectively apply.
If the object of the game is to “ride the technology wave” by predicting which oncoming technologies should be factored into one’s business, then the technologies in this model are relatively safe bets.  They are already past the stage of “timing”, where the technology is attractive but it may not yet be time for the market to implement it.  As WTF points out, the trick is not to simply latch on to a strategy like this, but to constantly update the model and its details as new technologies arrive at their “timing” stage. 
Enter the agile strategy.

Prediction and the Agile Entity


The agile process is, on its face, reactive.  It does not attempt to get out ahead of the combined wisdom of developers/process-users and consumers/end-users.  Rather, it seeks to harvest that wisdom rapidly in order to get out in front of the market as a whole, and only for the purposes of each development/rollout process.
An agile strategy (which, up to this point, I haven’t examined closely) should be a different animal.  Precisely because any strategy bridges a firm/organization’s entire set of new-product-development efforts as well as aligning the rest of the organization with these, an agile strategy should be (a) long-term and (b) to a significant degree in advance of current markets.  
In the case of the strategy outlined in the previous section (i.e., implement the “next economy” business model), one very straightforward way of adding agility to the strategy would be to add agility to the software and analytics used to implement it.  One tried-and-true method for doing this is “refactoring” – restructuring the software, for example by adding a layer of abstraction, so that it is relatively easy to change.
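As a minimal, hypothetical illustration of what such a layer of abstraction can look like, the sketch below (all names my own invention, building on the toy marketplace earlier) hides the matching logic behind an interface, so that a new strategy can be swapped in later without touching the code that calls it:

```python
# A thin abstraction layer: callers depend on the MatchingStrategy interface,
# so the concrete matching algorithm can be swapped without touching its callers.

from abc import ABC, abstractmethod

class MatchingStrategy(ABC):
    @abstractmethod
    def match(self, requests, offers):
        """Return a list of (request, offer) pairs."""

class CheapestFirst(MatchingStrategy):
    def match(self, requests, offers):
        # pair every request with the single cheapest offer (toy behavior)
        cheapest = min(offers, key=lambda o: o["price"])
        return [(r, cheapest) for r in requests]

class RatedSellerFirst(MatchingStrategy):
    def match(self, requests, offers):
        # alternative strategy: prefer the highest-rated seller instead
        best = max(offers, key=lambda o: o["rating"])
        return [(r, best) for r in requests]

class MarketplacePlatform:
    def __init__(self, strategy: MatchingStrategy):
        self.strategy = strategy  # injected, hence swappable later

    def run(self, requests, offers):
        return self.strategy.match(requests, offers)

platform = MarketplacePlatform(CheapestFirst())
print(platform.run(["rider_1"], [{"price": 9.5, "rating": 4.2},
                                 {"price": 12.0, "rating": 4.9}]))
# Switching to RatedSellerFirst later requires changing only the injected strategy.
```
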
Another method is simply to plan to revisit the strategy every 3-12 months.  The agile CEO I interviewed and reported on in a blog post five years ago did exactly that – a 5-year plan, revisited and informed both by outside feedback and by the information he gathered by attending scrum meetings.   
WTF adds a third dimension:  attempt to discern upcoming technologies and approaches that are “important”, and then “time” the shift to a new strategy incorporating those technologies/approaches.  “Prediction,” in these terms, means anticipating which oncoming technologies/approaches are important and also the pace of their evolution into “timely” products and services.
I would argue, however, that this is precisely where an agile strategy adds value.  It does not assume that what seems important now stays important, or that an important technology/approach will arrive in the market in the next 5 years, but rather that whatever steps we take towards preparing the way for a new technology/approach must be flexible enough to switch to another technology/approach even midway in the process.  For example, we may move towards augmenting our workers with AI, but in such a way that we can instead fully automate one set of workers in order to augment a new type of worker whose responsibilities include that of the old.  We would be, in a sense, “refactoring” the worker-task definition.
So here’s my take from reading WTF:  It should be possible, using WTF’s method of anticipating change, to implement an agile strategy as described.  Moreover, an agile strategy should be clearly better than usual ones.  Usual strategies and agile processes do not anticipate the future; agile strategies such as this do.  WTF-type strategies anticipate the future but are not flexible enough to handle changes between identification of the future and the time for its implementation; an agile strategy should be able to do so.

Sunday, August 19, 2018

Climate Change Mid-2018: The Relatively Good Bad News


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.

As I have argued before, human metrics on how well we are coping with climate change can be highly misleading, usually on the side of false optimism.  Two metrics that are clearly not thus biased are:

1.       Measurements of atmospheric CO2 at Mauna Loa in Hawaii, which have been recorded continuously since 1958;

2.       Estimates of Arctic sea ice volume (with extent serving as a loose approximation), especially at minimum in September, which have been carried out since the 1980s.

Over the past few years, I have covered the drumbeat of bad news from those two metrics, indicating that we are in a “business as usual” scenario that is accelerating climate change.  In the first half of 2018, what has happened in both cases is that the metrics are not following a “worst possible case” path – hence the “relatively good” part of the title.  At the same time, there is no clearly apparent indication that we are deviating from our “business as usual” scenario – mitigation is not clearly having any effect.  It is possible, however, that we are seeing the beginnings of an effect; it’s just not possible to detect it in the statistical “noise.”  And given that scientists are now talking about a “tipping point” in the near future in which not only a frightening 2 degrees C temperature rise by 2100 is locked in, but also follow-on feedbacks (like permafrost melt) that eventually take the temperature rise to a far more disastrous 3-4 degrees C – well, that’s the underlying, ongoing bad news.
Of course, this summer’s everlasting heat waves in the US, Europe, and the Middle East – heat waves clearly caused primarily by human-generated CO2 emissions and the resulting climate change – make the “new abnormal” obvious to those of us who are not wilfully blind.  But for anyone following the subject with an open mind, the heat waves are not a surprise.  
So let’s take a look at each metric.

The El Nino Effect Recedes


From late 2016 to around June of 2017, the El Nino effect crested and, as it has done in the past (e.g., in 1998), drove both temperatures and the rate of CO2 rise skyward.  Where 2013-2015 saw an unprecedented streak of three years of greater-than-2-ppm atmospheric CO2 growth, 2016 and 2017 both saw record-breaking growth of around 3 ppm (hiding a brief spurt to almost 4 ppm).  1998 (2.86 ppm) was followed by a year or two of growth around 1 ppm – in fact, slower than in 1996-7.  But the percentage rate of rise has also been rising over the years (it reached almost 1% in early 2017:  4 ppm over 404 ppm).  Therefore, it seemed a real possibility that 2018 would see 2.5 ppm growth.  Indeed, we saw 2.5 ppm growth as late as the first month or two of 2018.
Now, however, weekly and monthly growth has settled back to a 1.5-2 ppm rate, consistently since early 2018.  Even a 2 ppm rate gives hope that El Nino did not mean a permanent uptick in the rate of rise.  A 1.5 ppm rate would seem to indicate that 2018 is following the 1999 script – a dip in the rate of rise, possibly because of the follow-on La Nina.  It might even indicate a slight – very slight – decrease in the underlying rate of rise (i.e., the rate of rise with no El Nino or La Nina going on).  And that, as I noted above, is the first indication I have seen that things might possibly be diverging from “business as usual”.  
Of course, there’s always the background of bad news.  In this case, it lies in the fact that whereas ever since I started following CO2 Mauna Loa 6 or 7 years ago, CO2 levels in year 201x were about 10 ppm greater than in year 200x (10 years before), right now CO2 levels are about 13.5 ppm greater than in year 2008.  So, even if the El Nino effect has ended, the underlying amount of rise may still be increasing.  
The best indicator that our efforts are making a difference would be two years of 1 ppm rise or less (CO2 Mauna Loa measures the yearly amount of rise by averaging the Nov.-Feb. monthly rises).  Alas, no such trend has shown up in the data yet.
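For what it’s worth, here is a minimal sketch of that yearly-rise calculation as I have described it, using invented monthly values purely for illustration (the real monthly means are published by NOAA):

```python
# Yearly CO2 rise as described above: average the November-February
# year-over-year monthly rises. The monthly values here are invented for illustration.

this_year =  {"Nov": 408.0, "Dec": 409.0, "Jan": 410.0, "Feb": 411.0}
prior_year = {"Nov": 406.1, "Dec": 407.2, "Jan": 408.2, "Feb": 409.3}

monthly_rises = [this_year[m] - prior_year[m] for m in ("Nov", "Dec", "Jan", "Feb")]
annual_rise = sum(monthly_rises) / len(monthly_rises)
print(f"estimated annual rise: {annual_rise:.2f} ppm")  # ~1.80 ppm with these made-up values
```
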

Arctic Sea Ice:  Not In Stasis, Not in Free Fall


Over the last two years, the “new normal” in Arctic sea ice advance and retreat has become apparent.  It involves both unprecedented heat in winter, leading to record-low maxima in extent, and a cloudy and stormy July and August (the key melt months), apparently negating the effects of the winter warmth.  However, volume continues to follow a downward overall trend (if far more linear and closer to flat-line than the apparently exponential “free fall” until 2012, which had some predicting “ice-free in 2018”).
As Neven’s Arctic Sea Ice blog (neven1.typepad.com) continues to show, however, “ice-free in September” still appears only a matter of time (at a best guess, according to some statisticians, in the early 2030s).  Sea surface temperatures (SSTs) in key parts of the Arctic, like the waters above Norway and the Bering Sea, continue to rise and impact sea ice formation in those areas.  As the ice inherited from winter thins, we are beginning to see storms that actually break up the weaker ice into pieces, encouraging increased export of ice to the south via the Fram Strait.  The ice is so thin that a few days ago an icebreaker carrying scientists had to go effectively all the way to the North Pole to find ice thick enough to support their instruments for any length of time.
So the relatively good news is that it appears highly unlikely that this year will see a new low extent, much less an ice-free moment.  The underlying, ongoing bad news is that eventually the rise in SSTs will inevitably overcome the counteracting cloudiness in July and August (and that assumes that the cloudiness will persist).  Since 1980, extent at maximum has shrunk perhaps 12%, while extent at minimum has shrunk perhaps 45% (volume shows sharper decreases).  And in this, unlike CO2 Mauna Loa, there is no trace of a hint that the process is slowing down or reversing due to CO2 emissions reductions.  Nor would we expect there to be such an indication, given that we have only gotten globally serious about emissions reduction in the last 3 years (yes, I recognize that Europe is an exception). 

The Challenge


The question the above analysis raises is:  What will it take to really make a significant impact on our carbon emissions – much less the dramatic reductions scientists have been calling for?  I see no precise answer at the moment.  What I do know is that what we are doing needs to be done even faster and far more extensively – because the last few years have also seen a great increase in understanding of the details of change, as I have tried to show in some of my Reading New Thoughts posts.  The ways are increasingly there; the will is not.  And that, I think, along with countering the disgustingly murderous role of President Trump in particular in climate change (I am thinking of Hurricane Maria and Puerto Rico as an obvious example), should be the main task of the rest of 2018. 

Saturday, August 4, 2018

Reading New Thoughts: Two Books On the Nasty Details of Cutting Carbon Emissions


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.
I have finally gotten around to talking about two books I recently read, tomes that have greatly expanded my knowledge of the details and difficulties of reducing carbon emissions drastically.  These books are Peter Kalmus’s “Being the Change” and David Owen’s “Where the Water Goes”, and I’d like to discuss, very briefly, the new thoughts I believe they give rise to.

Kalmus and the Difficulties of Individual Efforts to Cut Carbon Emissions


“Being the Change” is a bit of an odd duck; it’s the personal musings of a physicist who professionally studies climate at the whole-planet scale, on his personal efforts to reduce his own greenhouse-gas emissions.  Imho, its major value is that it gives perhaps the best explanation I have read of the science of climate change.  However, as promised, it also discusses Kalmus’ careful dissection of his own and his family’s lifestyle in terms of carbon emissions, and his efforts to reduce those emissions as much as possible.
At the start we find out that Kalmus has been successful in reducing his emissions by 90% over the course of a few years, so that they are only 10% of what they were at the start of the effort.  This is significant because many scientists’ recommendations for what is needed to avoid “worst cases” talk about reductions of 80-90% in a time frame of less than 25 years.  In other words, it seems at first glance that a world of individual efforts, if not hindered as they are now by business interests or outdated government regulations, might take us all the way to a carbon-reduced world.
When we look at the details of Kalmus’ techniques, however, it becomes apparent that a major portion of them are not easily reproducible.  In particular, a significant chunk of his savings comes from not flying any more; but he was flying more than most, as a scientist attending conferences, so his techniques extended worldwide are more likely to achieve 50-70% emissions reductions, not 80-90%.  Then we add his growing his own food while using “human manure” as fertilizer; and that is something far more difficult to reproduce worldwide, given that perhaps 50% of humanity now lives in cities and that scavenging human manure is a very time-consuming activity (not to mention borderline illegal in some jurisdictions).  So we lose another 10-20%, for a net reduction of 30-60%, according to my SWAG (look it up).  
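Here is that rough arithmetic spelled out; the category adjustments are my own SWAG-level assumptions, not Kalmus’ numbers:

```python
# Rough (SWAG-level) arithmetic for how Kalmus' ~90% personal cut might translate
# worldwide. The adjustment ranges below are my own assumptions, as discussed above.

personal_cut = (0.80, 0.90)        # the 80-90% range scientists call for, roughly matched by Kalmus
flying_adjustment = (0.20, 0.30)   # much of his cut came from unusually heavy flying
food_adjustment = (0.10, 0.20)     # own-food/"humanure" practices are hard to reproduce at scale

low = personal_cut[0] - flying_adjustment[1] - food_adjustment[1]
high = personal_cut[1] - flying_adjustment[0] - food_adjustment[0]
print(f"plausible worldwide reduction: {low:.0%} to {high:.0%}")  # -> 30% to 60%
```
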
The net of it is, to me, that using many of Kalmus’ techniques universally, if it can be done, is very much worth doing; but also changing business practices and adopting government policies and global efforts is necessary, whether we do our individual efforts or not, to achieve the needed drastic reductions in carbon emissions, over a short or a long time period.  There are two pieces of good news here.  First, Kalmus notes that he could have achieved further significant personal reductions if he’d been able to afford a solar-powered home; and that’s something that governments (and businesses) can indeed take a shot at implementing worldwide.  Second, I heard recently that my old junior high’s grade school was now teaching kids about individual carbon footprints ("pawprints") and what to do about them.  Yes, the recommendations were weak tea; but it’s a good start at spreading individual carbon-emissions reductions efforts across society. 

Owen and the Resistance of Infrastructure, Politics, and Law to Emissions Reductions and Sustainability


Nominally, “Where the Water Goes” is about the Colorado River watershed, how its water is allocated, and changes due to the evolution of the economies of the neighboring states plus the pressures due to increasing climate-change water scarcity, increased usage from population growth, and the need for sustainability and carbon-emissions reductions.  What stands out about his account, however, is how weird and unexpected the permutations of watershed management involved turn out to be.  Here are a few:

·         The Colorado originally did not provide enough water for mining unless it was reserved in large chunks for individuals.  As a result, a “Law of the River” set of water-use rights has grown up in place of the usual “best fair use”:  the older your claim to a certain amount of the water, the more you can pre-empt others’ use of that scarce water. 

·         An elaborate system of aqueducts and reservoirs that feed water to cities from Los Angeles to Denver.

·         Rural economies entirely dependent on tourism from carbon-guzzling RVs and jetskis used on man-made lakes.

·         Agriculture that is better in the desert than in fertile areas – because the weather is more predictably good.

·         A food-production system in which supermarket chains and the like now drive agriculture to the point of demanding that individual farmers deliver produce of very specific types, weight ranges, and quality – or else;

·         A mandated cut in water use can lead to a real-world increase in water use – because now users must draw more water in low-water-use periods to avoid the risk of running out of their “claimed amount” in a high-use period.

Owen’s take is that it is possible, if people on all sides of the water-scarcity issue (e.g., environmentalists and business) sit down and work things out, to “muddle through” and preserve this strange world by incremental adaptation in a world of increased water scarcity due to climate change, and that crude efforts at quick fixes risk the catastrophic breakdown of the entire system.  My reaction is quite different:  changing a carbon-intensive system like the Colorado River’s is going to take fundamental rethinking, because not only does the “sunk cost” infrastructure of aqueducts, reservoirs, and irrigation-fed agriculture – plus rural-industry and state-city politics – reinforce the status quo, but the legal system itself (the legal precedents flowing into real-world practices) metastasizes and elaborates the carbon excesses of the system. 
For this particular system, and probably in a lot of cases, I conjecture that the key actors in bringing about carbon reductions are the farmers and the “tourism” industries.  The farmers are key because they in fact use far more water than the cities for their irrigation, and therefore carbon-reduction/sustainability policies that impact them (such as reductions in pesticides, less meat production, or less nitrogen in fertilizers) on top of water restrictions make their job that much harder.  It is hard to see how anything but money (correctly targeted supports and incentives) plus water-use strategies focused on this can overcome both the supermarket control over farmers and these constraints to achieve major carbon-use reductions.  
Meanwhile, the “tourism industries” are key because, like flying as discussed above, they represent an easier target than cities for major reductions in energy use and carbon emissions.  On the other hand, these rural economies are much more fragile, being dependent on low-cost transport and homes in the RV case, and on feeding the carbon-related whims of the rich and semi-rich few in the jetski case.  In the RV case, as in the farmer case, money for less fossil-fuel-consuming RVs and recreation methods will probably avoid major economic catastrophe.
However, I repeat, what is likely to happen if this sort of rethinking does not permeate throughout infrastructure, politics, and the law, is the very major catastrophe that was supposed to be avoided by incrementalism, only in the medium term rather than in the short term, and therefore with greater negative effects.  The tourism industries will be inevitable victims of faster-than-expected, greater-than-expected water shortages and weather destruction.  The farmers will be victims of greater-than-expected, faster-than-expected water evaporation from heat and weather destruction.  The cities will be the victims of resulting higher food prices and shortages.  
What Owen’s book does is highlight just how tough some of the resistance “built into the system” to carbon-emissions reductions is.  What it does not do is show that therefore incrementalism is preferable.  On the contrary.

Wednesday, July 25, 2018

Reading New Thoughts: Haskel and Westlake’s Capitalism Without Capital, and the Distorting Rise of Intangible Assets


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.
In my view, Haskel/Westlake’s “Capitalism Without Capital” is not so much an argument that the increasing importance of “intangible assets” constitutes a new and different “intangible economy”, as strong evidence that the ever-increasing impact of software means that a fundamental idea of economics – that everything can be modeled as a mass of single-product manufacturers and industries – is farther and farther from the real world.  As a result, I would argue, measures of the economy and our well-being based on those assumptions are increasingly distorted.  And we need to do more than tweak our accounting to reflect this “brave new world”.  
First, my own brief “summary” of what Haskel/Westlake say.  They start by asserting that present-day accounting does not count intangible company investments like software during development, innovation property such as patents and R&D, and “economic competencies” such as training, branding, and business-process research.  In the case of software development, for example, instead of inventory that is capitalized and whose value is represented at cost until sold, we typically have expenses but no capitalized value right up until the software is released and sold.  A software company, therefore, with its continual development cycle, appears to have zero return on investment on a lot of its product portfolio.
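A toy illustration of that accounting point, with entirely made-up numbers – a sketch of the general expensing-versus-capitalizing difference, not of any specific accounting standard:

```python
# Toy comparison: expensing software development as it happens vs. capitalizing it
# and amortizing it over an assumed useful life. All numbers are invented.

annual_dev_spend = 10.0    # $M spent on software development each year
revenue = 12.0             # $M of revenue from already-released software
useful_life_years = 5      # assumed amortization period if development were capitalized

# Treated purely as an expense (the treatment described above):
profit_expensed = revenue - annual_dev_spend    # 2.0 $M, with no software asset on the books
assets_expensed = 0.0

# Treated as a capital investment (steady state, hypothetical):
amortization_per_vintage = annual_dev_spend / useful_life_years
profit_capitalized = revenue - amortization_per_vintage * useful_life_years  # also 2.0 $M
assets_capitalized = sum(annual_dev_spend * (1 - age / useful_life_years)
                         for age in range(useful_life_years))                # 30.0 $M

print(profit_expensed, assets_expensed)        # 2.0 0.0
print(profit_capitalized, assets_capitalized)  # 2.0 30.0
# Steady-state profit looks the same either way, but one balance sheet shows ~$30M
# of intangible assets and the other shows none, which is roughly the
# "no capitalized value" point above.
```
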
Haskel/Westlake go on to argue that a lot of newer companies are more and more like software companies, in that they predominantly depend on these “intangible assets.”  The new breed of company, they say, has key new features:

1.       “sunk costs” – that is, you can’t resell development, research, or branding to get at its monetary value to you.

2.       “spillovers” – it is exceptionally easy for others to use your development-process insights and research.

3.       “scalability” – it requires relatively little effort and cost to scale usage of these intangible assets from a thousand to a million to a billion end users.

4.       “synergies” – research in various areas, software infrastructure, and business-process skills complement each other, so that the whole is more than the sum of the value-added of the parts.

5.       “uncertainty” – compared to, say, a steel manufacturing firm, the software company has far more potential upside from its investments, and often far more potential downside.

6.       “contestedness” – such a company faces much greater competition for control of its assets, particularly since those assets are so easy for others to use.

Finally, Haskel/Westlake say that, given their assumption that “intangible companies” make up a significant and growing part of the global economy, they already have significant impacts on that economy in particular areas:

·         “Secular stagnation” over the last decade is partially ascribed to the increasing undervaluing of these companies’ “intangible assets”.

·         Intangible companies increase income inequality because they function best with physical communication by specialized managers in cities.

·         Intangible companies are under-funded, because banks are not well suited to investing without physical capital to repossess and resell.  Haskel/Westlake suggest that greater use of equity rather than loans is required, and that such equity may be obtained from institutional investors and by funding collaborating universities.

·         Avoiding too much failure will require new business practices as well as new government encouragement, e.g., via better support for in-business control of key intangible assets (clear intangible-asset ownership rules) or supporting the new methods of financing the “intangible companies.”

It’s the Software, Sirs


Let’s look at it a different way.  Today’s economic theory grew out of a time (1750-1850) when large-scale manufacturing was on the rise, and its microeconomics reflects that, as does the fact that data on economic performance (e.g., income) comes from surveys of businesses, which is then “adjusted” to try to include non-business data (trade, and reconciliation with personal income reports).  From 1750 to about 1960, manufacturing continued to increase as a percentage of overall economic activity and employment, at the expense of farming.  From 1960 or so, “services” (ranging from hospitals to concierges) began to carve into that dominance, but all those services, in terms of jobs, could still be cast in the mold of “corporation that is mostly workers producing for/dealing with customers, plus physical infrastructure/capital”. 
Now consider today’s typical large software-driven company.  Gone is the distinction between line and staff.  Manufacturing has shrunk dramatically as a share of economic activity, both within the corporation and overall.  Bricks and mortar is shrinking.  Jobs are much more likely to be things like developers (development is neither manufacturing nor engineering but applied math), marketers/branders, data scientists (in my mind, a kind of developer), help desk, and Internet presence support.  The increased popularity of “business agility” goes along with shorter careers at a particular company, outsourcing, and intra-business “services” that are primarily software (“platform as a service”, Salesforce.com).  Success is defined as control over an Internet/smartphone software-related bottleneck like goods-ordering (Amazon), advertising (Google), or “apps” (Apple). 
Now consider what people are buying from these software-driven firms.  I would argue that it differs in two fundamental ways from “manufacturing” and old-style “services”:

1.       What is bought is more abstract, and therefore applicable to a much wider range of products.  You don’t just buy a restaurant meal; you buy a restaurant-finding app.  You don’t just browse for books; you browse across media.  What you are selling is not a widget or a sweater, as in economics textbooks, but information or an app.

2.       You can divide up the purposes of buying into (a) Do (to get something done); (b)  Socialize/Communicate, as in Facebook and Pinterest; and (c) Learn/Create, as in video gaming and blog monetization.  The last two of these are really unlike the old manufacturing/services model, and their share of business output has already increased to a significant level.  Of course, most if not all software-driven companies derive their revenues from a mix of all three.

The result of all this, in economic terms, is complexity, superficially masked by the increased efficiency.  Complexity for the customer, who narrows his or her gaze to fewer companies after a while.  Complexity for the business, whose path to success is no longer as clear as cutting costs amid stable strategies – so the company typically goes on cutting costs and hiring and firing faster or outsourcing more in default of an alternative.  Complexity for the regulator, whose ability to predict and control what is going on is undercut by such fast-arriving devices as “shadow banking”, information monopolies, and patent trolling.
In other words, what I am arguing for is a possible rethinking of macroeconomics in terms of different microeconomic foundations, not the ones of behavioral economics, necessarily, but rather starting from the viewpoint “what is really going on inside the typical software-driven corporation” and then asking how such a changed internal world will reflect back to the overall economy, how macroeconomic data can capture what is going on, and how one can use the new data to regulate and anticipate future problems better.
John le Carré once said that the key problem post-Cold-War was how to handle the “wrecking infant” – there he was referencing an amoral businessman creating Third-World havoc, although you can translate that to the situation in the US right now.  The software-driven business, in terms of awareness of what to do, is a bit of a wrecking infant.  If it isn’t helped to grow up, the fault will not lie solely in our inability to anticipate the distorting rise of intangible assets, its side-effect, but also in our failure to deal adequately with its abstraction, its new forms of organization and revenue, and its complexity.

Wednesday, July 4, 2018

There’s Something Wrong Here: July 4, 2018


For the first time in polling history (since 2000), less than half of Americans say they are “extremely proud” to be an American.  Before Donald J. Trump became the Republican candidate for the Presidency, that figure had never gone below 54%; it is now 47%.  The figures show equivalent declines among Democrats and independents, but a slight uptick among Republicans.
The conclusion is that President Trump has made a net 7 percentage points of the country less proud – more ashamed – to be American.  I cannot find another President of whom historians say that he made a large proportion of Americans more ashamed of their nationality. 

Monday, June 11, 2018

Reading New Thoughts: Hessen’s Many Lives of Carbon and the Two Irresistible Forces of Climate Change


Disclaimer:  I am now retired, and am therefore no longer an expert on anything.  This blog post presents only my opinions, and anything in it should not be relied on.
Dag Hessen’s “The Many Lives of Carbon” is the best book I have been able to find on the chemical details of the carbon cycle and CO2-driven climate change (despite some odd idioms that make it difficult to understand him at times).  In particular, it goes deeply into the “positive feedbacks” that exacerbate departures from atmospheric-CO2 equilibrium and the “negative feedbacks” that operate to revert to equilibrium.  In the long run, negative feedbacks win; but the long run can be millions of years.
More precisely, even if humans weren’t involved, there would be medium-term and long-term global warmings and coolings.  The medium-term warming/cooling is usually thought to result from the Milankovitch cycles, in which the stretching of the Earth’s elliptical orbit (“eccentricity”), the periodic oscillation in the tilt of the Earth’s axis, and “precession” (the slow wobble of that axis) combine to yield unusually low Northern-Hemisphere summer heat from the Sun, so that winter snow fails to fully melt.  That kickstarts glaciation, which acts as a positive feedback for global cooling along with atmospheric CO2 loss.  This cooling operates slowly over most of the next 100,000 years, descending into an Ice Age, but then the opposite configuration (maximum summer heat from the Sun) kicks in and brings a sharp rise in temperatures back to roughly the original point. 
Long-term global warming is much rarer – there are instances 55 million years ago and 250 million years ago, and indeed some indications that most of the previous 5 largest mass extinctions on Earth were associated with this type of global warming.  The apparent cause is massive volcanism, especially underwater volcanism (the clouds from on-land volcanism actually reduce temperatures significantly over several years, but then the temperatures revert to what they were before the eruption).  It also seems clear that the main if not only cause of the warming is CO2 and methane (CH4) released into the atmosphere by these eruptions – carbon infusions so persistent that the level of atmospheric CO2 did not revert to equilibrium for 50 million years or so after the last such warming.
For both medium-term and long-term global warming/cooling, “weathering” acts as a negative feedback – but only over hundreds of thousands of years.  Weathering involves water (carrying dissolved CO2) and wind wearing away at silicate rock, producing carbon-bearing compounds that are then carried to the ocean.  There, among other outcomes, they are taken up by creatures that die, forming ocean-floor limestone that “captures” the carbon and thus withdraws it from the cycle circulating carbon into and out of the atmosphere.
Human-caused global warming takes the slow process of animal and plant fossils being turned into coal, oil, and natural gas below the Earth’s surface – a carbon sink – and, mostly by burning the result for fuel (hence “fossil fuels”), turns it into an unprecedentedly fast direct cause of global warming.   The direct effect of CO2 doubling in the atmosphere is 2-2.8 degrees Centigrade of land-temperature warming, but it is accompanied by positive feedbacks (including increased cloud cover, increased atmospheric methane, and decreases in albedo [reflectivity of the sun’s heat] from “black carbon”/soot) that, more slowly, may take the net global warming associated with CO2 doubling up to 4 degrees C.  Some of this is offset by the ocean, which absorbs much of the added heat and acts as a “sink” taking carbon out of the atmosphere; but at some point soon the ocean will be in balance with the atmosphere, and much of what we emit in carbon pollution will effectively stay aloft for thousands of years. 
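For readers who want the rough arithmetic behind numbers like these, here is a sketch using the standard logarithmic approximation for CO2 radiative forcing; the sensitivity-parameter values are illustrative of the range being discussed, not Hessen’s own figures:

```python
import math

# Standard logarithmic approximation for CO2 radiative forcing:
#   delta_F ~= 5.35 * ln(C / C0)  watts per square meter (Myhre et al.)
# Warming is then roughly delta_T ~= lambda * delta_F, where lambda (in K per W/m^2)
# is a sensitivity parameter that bundles up the feedbacks.

def forcing_w_per_m2(c_ppm, c0_ppm=280.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling_forcing = forcing_w_per_m2(560.0)   # ~3.7 W/m^2 for a doubling of CO2
for lam in (0.55, 0.8, 1.1):                 # illustrative low / central / high sensitivities
    print(f"lambda = {lam}: ~{lam * doubling_forcing:.1f} degrees C per CO2 doubling")
# Roughly 2 degrees C with weak feedbacks and around 4 degrees C with strong ones,
# bracketing the range discussed above.
```
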
My point in running through the implications of Hessen’s analysis is that there is a reason for scientists to fear for all of our lives:  the global warming that, in the extreme, severely disrupts agriculture and food webs and makes it unsafe for most of us to operate outside for more than a short period of time for most of the year, except in the high mountains, is a CO2-driven “irresistible force”.  There is therefore a good case to be made that “mitigation” – reducing emissions of carbon dioxide (and methane and black carbon and possibly nitrous oxide) – is by far the best way to avoid the kind of global warming that literally threatens humanity’s survival.  And this is a point that I have argued vociferously in past blog posts.

The Other Irresistible Force:  The Business/Legal Bureaucracy


And yet, based on my reading, I would argue that there is an apparently equally irresistible force operating on its own momentum in today’s world:  what I am calling the Business/Legal Bureaucracy.  Here, I am using the word “bureaucracy” in a particular sense:  that of an organization operating on its own momentum.  In the case of businesses, that means a large-business bureaucracy operating always under a plan to make more profit this year than last.  In the case of the law, that means a court system and mindset always to build on precedents and to support property rights.  
To see what I am talking about, consider David Owen’s “Where the Water Goes”, about what happens to all the water in the Colorado River watershed.  Today, most of that water is shipped east to fuel the farms and businesses of Colorado and its vicinity, or west, to meet the water needs of Las Vegas and Los Angeles and the like.  The rest is allocated to farmers or other property owners along the way under a peculiar legal doctrine called “prior appropriation” – instead of the more typical equal sharing among bordering property owners, it limits each portion to a single “prior claimant”, since in the old mining days there wasn’t enough water in each portion for more than one miner to be able to “wash” the gold effectively.  Each new demand for water therefore is fitted legally within this framework, ensuring that the agricultural business bureaucracy will push for more water from the same watershed, while the legal bureaucracy will accommodate this within the array of prior claimants.
To see what creates such an impression of an irresistible force about this, consider the need to cut back on water use as climate change inevitably decreases the snowpack feeding the Colorado.  As Owen points out, there is no clear way to do this.  Business-wise, no one wants to be the one to sacrifice water.  Legally, the simple solution of limiting per-owner water use can actually result in more water use from each owner, as seasonal and annual variations in water needs mean that many owners will now need to draw down and store water in off-seasons “just in case”.   The result is a system that not only resists “sustainability” but in fact can also be less profit-producing in the long run – the ecosystems shafted by this approach, such as the now-dry Mexican terminus of the Colorado, may well be those that might have been most arable in the global-warming future.  And the examples of this multiply quickly:  the wildfire book I reviewed in an earlier blog post noted the extreme difficulties in fighting wildfires with their exacerbating effect on global warming, difficulties caused by the encroachment of mining and real-estate development on northern forests, driven by Legal/Business Bureaucracy.
The primary focus of the Legal/Business Bureaucracy with respect to climate change and global warming, therefore, is after-the-fact, incremental adaptation.  It is for that reason, as well as the dangerous trajectory we are now on, that I view calling the most disastrous future climate-change scenarios “business as usual” as entirely appropriate.

Force, Meet Force


It also seems to me that reactions to the most dire predictions of climate change fall into two camps (sometimes in the same person!): 

1.       We need drastic change or few of us will survive, because no society or business system can possibly resist the upcoming “business as usual” changes:  extreme weather, loss of water for agriculture, loss/movement of arable land, large increases in the area ripe for debilitating/killing tropical diseases, extreme heat, loss of ocean food, loss of food-supporting ecosystems, loss of seacoast living areas, possible toxic emissions from the ocean. 

2.       Our business-economy-associated present system will somehow automagically "muddle through", as it always seems to have done.  After all, there is, it seems, plenty of "slack" in the water efficiency of agriculture, plenty of ideas about how to adapt businesses and governments to encourage better adaptation and mitigation "at the margin" (e.g., solar now dominates the new-energy market in the United States, although it does not seem to have made much of a dent in existing uses of fossil fuels), and plenty of new business-associated technologies to apply to the problem (e.g., business sustainability metrics). 

An interesting example of how both beliefs can apparently coexist in the same person is Steven Pinker's "Enlightenment Now", an argument that the late-18th-century "reason reinforced by science" approach to the world known as the Enlightenment is primarily responsible for a dramatic, continuing improvement in the overall human situation, and should be chosen as the primary impetus for further improvements.  My overall comment on this book is that Pinker has impressed me with the breadth of his reading and the general fairness of his assessments of that reading.  However, Pinker appears to have one frustrating flaw:  a belief that liberals and other proponents of change in such areas as equality, environmentalism, and climate change – and their counterparts within government – are so ideology-driven that they are actually harmful to Enlightenment-type improvements.  Any reasonable reading of Jeffrey Sachs' "The Age of Sustainable Development", which I hope to discuss in a later post, puts the lie to that one. 
In any case, Pinker has an extensive section on climate change, in which he both notes the "existential threat" (i.e., threat to human existence) posed by human-caused climate change and asserts his belief that it can be overcome by a combination of geoengineering and Business/Legal Bureaucracy adaptation to the findings of Enlightenment scientists.  One particular assertion is that, if regulations could be altered, new nuclear-power technologies could meet all our energy needs in short order, with little risk to people and little need for waste storage.  I should note that James Hansen appears to agree with him (although Hansen is far more pessimistic that regulations will be altered or that nuclear businesses will choose to change their models), and Joe Romm very definitely does not.  
One also often sees the second camp among experts on Business/Legal Bureaucracy matters such as the allocation of water in California in response to climate-change-driven droughts.  These reactions implicitly assume that an exponential global-warming process will have only linear effects on water availability, and conclude that, under that assumption, there is plenty of room for existing agriculture to use less water and that everyone should "stop panicking." 
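To see how much that assumption matters, here is a minimal sketch (in Python, with made-up illustrative numbers rather than actual Colorado or California data) comparing a straight-line extrapolation of today's water losses with losses that accelerate as warming compounds:

# Linear vs. accelerating decline in water availability (illustrative numbers only).
start = 100.0        # index of available water today
loss_today = 1.0     # observed loss this year: ~1 index point
acceleration = 1.05  # assumption: the yearly loss itself grows 5% per year as warming compounds

linear = accelerating = start
yearly_loss = loss_today
for year in range(1, 31):
    linear -= loss_today         # camp-2 style: keep losing at today's rate forever
    accelerating -= yearly_loss  # the loss grows along with the warming behind it
    yearly_loss *= acceleration
    if year in (10, 20, 30):
        print(f"year {year:2}: linear {linear:5.1f}, accelerating {accelerating:5.1f}")

After 30 years, the linear extrapolation still shows about 70 percent of today's water; the accelerating case shows about a third – the difference between "stop panicking" and a crisis.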
So what do I think will happen when force meets force?

Flexibility Is the Enemy of Agility


One of the things that I noticed (and noted in my blog) in my past profession is that product and organizational flexibility – the ability to easily repurpose something for other needs and uses – is a good thing in the short run, but often a bad thing in the long run.  How can this be?  Because the long run will often require fundamental changes to products and/or organizations, and the more the existing product or system is "patched" to meet new needs, the greater the cultural, organizational, and experiential investment in the present system – not to mention the greater the gap between that system and the outside changes that will eventually force a fundamental redesign.
There is one emerging business strategy that reduces the need for such major, business-threatening transitions:  business agility.  Here the idea is to build both products and organizations for much more radical change, plus better antennae to the outside world, in order to stay as close as possible to changes in customer needs.  But flexibility is in fact the enemy of agility:  patches to the existing product or organization entrench its "hard-coded" portions, making fundamental change less likely and the huge cost of an eventual complete do-over more inevitable.
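For readers from my old profession, a purely hypothetical Python sketch of the difference (none of this is any real product's code):  the "flexible" version absorbs every new need as another hard-coded, order-dependent patch, while the "agile" version keeps the core small and expects its parts to be replaced wholesale.

# Hypothetical sketch only: "flexibility" as accumulated patches vs. an "agile" core.

# Flexibility by patching: each new need becomes another hard-coded branch,
# deepening the investment in this one function and raising the cost of ever redoing it.
def quote_shipping_patched(region, weight_kg):
    if region == "US":
        return 5.0 + 0.5 * weight_kg
    if region == "EU":                 # patch #1: Europe launch
        return 7.0 + 0.6 * weight_kg
    if weight_kg > 30:                 # patch #2: oversize rule -- but only for regions
        return 25.0                    #           the earlier patches didn't already catch
    return 10.0                        # patch #3: a "default" nobody remembers adding

# Agility by design: the core stays tiny; changing or replacing one rate, or all of them,
# touches a table entry rather than the accumulated branch logic above.
RATES = {
    "US": lambda kg: 5.0 + 0.5 * kg,
    "EU": lambda kg: 7.0 + 0.6 * kg,
}

def quote_shipping(region, weight_kg, rates=RATES):
    return rates.get(region, lambda kg: 10.0)(weight_kg)

print(quote_shipping_patched("EU", 40), quote_shipping("EU", 40))   # 31.0 31.0

Both versions give the same answer today; the difference is what it costs to change them tomorrow.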
As you can guess, I think that today's global economy is extraordinarily flexible, and that flexibility is what camp 2 counts on to save the day.  But this very flexibility is the enemy of the agility that I believe we will find ourselves needing, if we are to avoid a collapse of the economy and of our food production.
But it's a global economy – that's part of its global Business/Legal Bureaucracy flexibility – so local or even regional collapses aren't enough to force fundamental change.  Rather, if we fail to mitigate strongly – that is, in an agile fashion – and to create a sustainable global economy over the next 20 years, then some time in the 20 years after that the average costs of increasing climate-change stresses will put the global economy in permanently negative territory.  Unless fundamental change happens immediately after that, the virtuous circle of today's economy turns into a vicious, accelerating circle of economic collapse:  the smaller our economies become, the less we can spend on combating the ever-increasing effects of climate change.  At the end, we are left with less than half (and maybe one tenth) of our food sources and arable land, much of it in new extreme-northern areas with lower-quality soil, with food being produced under far more brutal weather conditions.  Or we could by then have collapsed governmentally to such a point that we can't even manage that.  It's an existential threat.
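A deliberately crude toy model (every parameter invented purely for illustration) shows the arithmetic I have in mind:  let underlying growth stay fixed while climate-stress costs claim a share of output that compounds faster, and net growth eventually turns permanently negative and then accelerates downward.

# Deliberately crude toy model -- every number is invented for illustration.
gdp = 100.0            # index of global output today
growth = 0.03          # underlying growth: 3% per year
damage_share = 0.005   # climate-stress costs: 0.5% of output today...
damage_growth = 1.07   # ...with that share compounding 7% per year as warming accelerates

for year in range(1, 61):
    net_rate = growth - damage_share   # the "virtuous circle" lasts while this is positive
    gdp *= (1 + net_rate)
    damage_share *= damage_growth      # stresses keep compounding no matter how small GDP gets
    if year % 10 == 0:
        print(f"year {year:2}: GDP index {gdp:7.1f}, net growth {net_rate:+.1%}")

With these made-up numbers, net growth goes negative around year 25 to 30 and worsens every year afterward – roughly the "20 years, then 20 more" shape sketched above.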

Initial Thoughts On What to Do


Succinctly, I would put my thoughts under three headings:

1.       Governmental.  Governments must drive mitigation well beyond what they have done so far under their Paris Agreement commitments.  We, personally, should place climate change at the top of our priority lists (where possible), reward commitments and implementation politically, and demand much more.

2.       Cultural.  Businesses, other organizations, and the media should be held accountable for promoting, or failing to promote, mitigation-targeted governmental, organizational, and individual efforts.  In other words, we need to create a new lifestyle.  There’s an interesting take on what a lifestyle like that would look like in Peter Kalmus’ “Being the Change”, which I hope to write about soon.

3.       Personal.  Simply as a matter of not being hypocrites, we should start to change our carbon “consumption” for the better.  Again, Kalmus’ book offers some suggestions.

Or, as Yoda would put it, “Do or do not.  There is no try.”  Because both Forces will be against you.